Show this function is associative (or provide a counter example)

Summary
The discussion revolves around proving the associativity of the function f(x,y) = x√(1+y²) + y√(1+x²). Participants are attempting to show that f(x,f(y,z)) equals f(f(x,y),z) for real numbers x, y, and z. While numerical experiments suggest the function behaves associatively, a formal algebraic proof remains elusive, with attempts involving expansion and squaring yielding unwieldy expressions. The use of computer algebra systems like Maple has provided some numerical confirmations, but a symbolic proof appears challenging. Ultimately, the conversation highlights the difficulty of proving associativity for this specific function, with suggestions for alternative approaches and the potential for geometric interpretations.
  • #31
Ray Vickson said:
Success at last! Letting F1 = f(x,f(y,z)) and F2 = f(f(x,y),z), F1 becomes the sum of two terms with square roots inside square roots. In Maple, we can extract the first one as L1 = op(1,F1). Since we are trying to prove F1 = F2, let's follow the suggestion by 'micromass' and compare L1 with R1 = F2 - op(2,F1); these ought to be equal, because in Maple we have F1 = op(1,F1)+op(2,F1).

Leaving off the question of signs, square and expand both L1 and R1. We find that RR1 = R1^2 consists of 11 separate terms, of which two involve the same square-roots-within-square-roots, but with different outside factors, both negative. The exact way to extract these depends on where they occur in the expression tree, but one can list all 11 terms through S = seq(op(i,RR1),i=1..11); in one session the two terms of interest were S[5] and S[11]. Writing RR1 = RR1a - S[5] - S[11], we want to prove that S[5]+S[11] = RR1a - L1^2. Now the right-hand side no longer has square roots inside square roots. When we again square and compare (S[5]+S[11])^2 with (RR1a-L1^2)^2, we find the difference = 0!

Note that the exact way of extracting the wanted terms may be session-dependent and/or machine-dependent, etc., so success depends on a mix of human and machine processing, with the human determining what terms to group together. Getting a machine to do this unaided might be almost beyond the capabilities of current software.

In principle, all the work above could be done manually, if you have a couple of free days to spare doing algebra.

Of course, since we compared squares with squares, there are still sign considerations to worry about, but those seem comparatively simple to deal with.

Wow that's epic, thank you for taking such a deep look at it.
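For anyone following along without Maple, here is a rough Python sketch of the corresponding numerical spot check. The function name f and the sampling range are my own choices, and this only gathers evidence; it is not the symbolic verification described above.

```python
import math
import random

def f(x, y):
    # The operation under discussion: f(x, y) = x*sqrt(1 + y^2) + y*sqrt(1 + x^2)
    return x * math.sqrt(1.0 + y * y) + y * math.sqrt(1.0 + x * x)

# Compare f(x, f(y, z)) with f(f(x, y), z) at many random real triples.
random.seed(0)
for _ in range(10000):
    x, y, z = (random.uniform(-10.0, 10.0) for _ in range(3))
    assert math.isclose(f(x, f(y, z)), f(f(x, y), z),
                        rel_tol=1e-9, abs_tol=1e-9), (x, y, z)
print("no counterexample found in 10000 random triples")
```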
 
  • #32
Ah, I get it, finally! All of the squares and square roots were really just hiding the fact that your ##f(x,y)## can be written as ##g(g^{-1}(x)+g^{-1}(y))## for some other function ##g(x)##. This confers the magic 'associative' property.
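In case it's useful to see why that form is automatically associative, here is the one-line argument, written as a sketch for a bijection ##g:\mathbb{R}\to\mathbb{R}## with ##x\ast y := g(g^{-1}(x)+g^{-1}(y))##:

$$(x\ast y)\ast z = g\bigl(g^{-1}\bigl(g(g^{-1}(x)+g^{-1}(y))\bigr)+g^{-1}(z)\bigr) = g\bigl(g^{-1}(x)+g^{-1}(y)+g^{-1}(z)\bigr) = x\ast(y\ast z),$$

since ##g^{-1}(g(u))=u## for every real ##u## and ordinary addition is associative; the symmetric computation gives the same middle expression for ##x\ast(y\ast z)##.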
 
Last edited:
  • #33
Hmm. Has everyone stopped following this thread? There's a really easy way and there's a really hard way. The contrast between the two is pretty severe.
 
  • #34
I haven't had the time or inclination to work very much on this, but I admit I am curious.
 
  • #35
LCKurtz said:
I haven't had the time or inclination to work very much on this, but I admit I am curious.

Just writing it as f(x,y)=sinh(arcsinh(x)+arcsinh(y)) (following micromass's hint) makes it immediately fall apart. The only part that's even a little hard is keeping track of the parentheses.
 
Last edited:
  • #36
I guess what I'm really after here is asking whether anybody else realizes how fantastically brilliant micromass's suggestion is, especially following Ray Vickson's heroically hard but straightforward solution. It also lets you construct lots of other associative functions that would be equally hard to prove the direct way. Using micromass's suggestion, the first time I worked it out the proof fell out so quickly I didn't even see how it happened, it was that quick. I had to replay it a few times before I really understood it. No real 'calculation' was even needed, much less an intensive computer algebra project. It's that good.
 
  • #37
Just got back with a bit of time to look at it again. I fully agree with Dick about the brilliance of Micromass's observation. And good work to you, Dick, for observing the general ##g(g^{-1}(x)+g^{-1}(y))## form.
 
  • #38
Dick said:
Something like that. Notice if you put x=sinh(a) and y=sinh(b) then sinh(a+b) is equal to your f(x,y). So f(x,y)=sinh(asinh(x)+asinh(y)).

Dick said:
Hmm. Has everyone stopped following this thread? There's a really easy way and there's a really hard way. The contrast between the two is pretty severe.

Dick said:
Ah, I get it, finally! All of the squares and square roots were really just hiding the fact that your ##f(x,y)## can be written as ##g(g^{-1}(x)+g^{-1}(y))## for some other function ##g(x)##. This confers the magic 'associative' property.

Ah thanks for the tips! I tend not to have internet access over the weekends as I'm at work from Friday afternoon through Sunday night, and I don't have internet there. Also, I've been afraid to put more time on this problem as I have 5 other problems I have not yet solved and I was afraid I'd have nothing to show my professor! I will definitely get back to these hints as soon as I complete the other problems!
 
  • #39
ArcanaNoir said:
Ah thanks for the tips! I tend not to have internet access over the weekends as I'm at work from Friday afternoon through Sunday night, and I don't have internet there. Also, I've been afraid to put more time on this problem as I have 5 other problems I have not yet solved and I was afraid I'd have nothing to show my professor! I will definitely get back to these hints as soon as I complete the other problems!

Definitely come back to this. Once you've wrapped your head around the hints, you might find it's the easiest of the problems. Wouldn't that be a nice surprise?
 
Last edited:
  • #40
Dick said:
Just writing it as f(x,y)=sinh(arcsinh(x)+arcsinh(y)) (following micromass's hint) makes it immediately fall apart. The only part that's even a little hard is keeping track of the parentheses.

Okay maybe I'm dense, but I don't see how f(x,y)=sinh(arcsinh(x)+arcsinh(y)). Here is what I did to try to get it to come out:

$$\begin{aligned}
\sinh \bigl(\sinh^{-1}(x) + \sinh^{-1}(y)\bigr)
&= \sinh \bigl(\ln (x+\sqrt{1+x^2}) + \ln (y+\sqrt{1+y^2})\bigr) \\
&= \sinh \bigl(\ln \bigl[(x+\sqrt{1+x^2})(y+\sqrt{1+y^2})\bigr]\bigr) \\
&= \frac{e^{\ln \text{stuff}} - e^{-\ln \text{stuff}}}{2} \\
&= \frac{1}{2} \left((x+\sqrt{1+x^2})(y+\sqrt{1+y^2}) - \frac{1}{(x+\sqrt{1+x^2})(y+\sqrt{1+y^2})}\right)
\end{aligned}$$

And I have tried some manipulations after that but never get it to simplify to be f(x,y).
Am I using the wrong identity?
 
  • #41
Dick said:
Definitely come back to this. Once you've wrapped your head around the hints, you might find it's the easiest of the problems. Wouldn't that be a nice surprise?

Dick, you might be interested in this:

http://www.researchgate.net/publication/2121594_On_the_Possible_Monoid_Structures_of_the_Natural_Numbers_N_or_Finding_All_Associative_Binary_Operations_on_N

Look at equation 3.2 and proposition 3.1. As is usual, someone has thought about this stuff before. It looks to me like if you use R instead of N, let ##f(x,y) = x+y## and ##\omega=\sinh##, the result follows from the associativity of addition. And, as you have observed, there are other bijections ##\omega##.
 
Last edited by a moderator:
  • #42
ArcanaNoir said:
Okay maybe I'm dense, but I don't see how f(x,y)=sinh(arcsinh(x)+arcsinh(y)). Here is what I did to try to get it to come out:

$$\begin{aligned}
\sinh \bigl(\sinh^{-1}(x) + \sinh^{-1}(y)\bigr)
&= \sinh \bigl(\ln (x+\sqrt{1+x^2}) + \ln (y+\sqrt{1+y^2})\bigr) \\
&= \sinh \bigl(\ln \bigl[(x+\sqrt{1+x^2})(y+\sqrt{1+y^2})\bigr]\bigr) \\
&= \frac{e^{\ln \text{stuff}} - e^{-\ln \text{stuff}}}{2} \\
&= \frac{1}{2} \left((x+\sqrt{1+x^2})(y+\sqrt{1+y^2}) - \frac{1}{(x+\sqrt{1+x^2})(y+\sqrt{1+y^2})}\right)
\end{aligned}$$

And I have tried some manipulations after that but never get it to simplify to be f(x,y).
Am I using the wrong identity?

Use micromass's hint. And that sinh(arcsinh(x))=x and cosh(arcsinh(x))=sqrt(1+x^2).
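For what it's worth, the log route in post #40 isn't a dead end either: since ##(x+\sqrt{1+x^2})(\sqrt{1+x^2}-x)=1##, the reciprocal in the last line there can be rewritten and everything collapses. A quick sketch, writing ##X=\sqrt{1+x^2}## and ##Y=\sqrt{1+y^2}##:

$$\tfrac{1}{2}\left[(x+X)(y+Y)-\frac{1}{(x+X)(y+Y)}\right] = \tfrac{1}{2}\bigl[(x+X)(y+Y)-(X-x)(Y-y)\bigr] = \tfrac{1}{2}\bigl[2xY+2yX\bigr] = x\sqrt{1+y^2}+y\sqrt{1+x^2} = f(x,y).$$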
 
  • #43
LCKurtz said:
Dick, you might be interested in this:

http://www.researchgate.net/publication/2121594_On_the_Possible_Monoid_Structures_of_the_Natural_Numbers_N_or_Finding_All_Associative_Binary_Operations_on_N

Look at equation 3.2 and proposition 3.1. As is usual, someone has thought about this stuff before. It looks to me like if you use R instead of N, let ##f(x,y) = x+y## and ##\omega=\sinh##, the result follows from the associativity of addition. And, as you have observed, there are other bijections ##\omega##.

Thanks for the reference! I'll try and look at it once I get past ResearchGate's signup procedure...
 
Last edited by a moderator:
  • #44
Dick said:
Thanks for the reference! I'll try and look at it once I get past ResearchGate's signup procedure...

Close the pop up and you should see the PDF link on the right hand side of the page.
 
  • #45
ArcanaNoir said:
Close the pop up and you should see the PDF link on the right hand side of the page.

Thanks, but I don't see it. I think I may have to have an account and be logged in. But that's ok, I found the same paper on arXiv.org.
 
  • #46
ArcanaNoir said:
Okay maybe I'm dense, but I don't see how f(x,y)=sinh(arcsinh(x)+arcsinh(y)).
...

And I have tried some manipulations after that but never get it to simplify to be f(x,y).
Am I using the wrong identity?
Try looking at it this way:

Suppose that given x and y, there exist real numbers a and b such that

##x = \sinh(a)## and ##y = \sinh(b)##.

Then f(x,y) becomes the following:

##\displaystyle f(x,\,y)=f(\sinh(a),\,\sinh(b)) = \sinh(a)\sqrt{1+\sinh^2(b)}+\sinh(b)\sqrt{1+\sinh^2(a)}##

Then use one of the Pythagorean identities to get rid of the radical.

After that use an argument addition identity (analogous to angle addition for trig functions).
 
Last edited:
  • #47
SammyS said:
Try looking at it this way:

Suppose that given x and y, there exist real numbers a and b such that

##x = \sinh(a)## and ##y = \sinh(b)##.

Then f(x,y) becomes the following:

##\displaystyle f(x,\,y)=f(\sinh(a),\,\sinh(b)) = \sinh(a)\sqrt{1+\sinh^2(b)}+\sinh(b)\sqrt{1+\sinh^2(a)}##

Then use one of the Pythagorean identities to get rid of the radical.

After that use an argument addition identity (analogous to angle addition for trig functions).

I'm not sure that's terribly illuminating, I'd just say following micromass's hint, ##\sinh(arcsinh(x)+arcsinh(y))=\sinh(arcsinh(x)) \cosh(arcsinh(y))+\cosh(arcsinh(x)) \sinh(arcsinh(y))##. Using ##\cosh(arcsinh(x))=\sqrt{1+x^2}## this seems pretty straightforward. Sorry I can't find a texy way to do the inverse hyperbolics.
 
Last edited:
  • #48
LCKurtz said:
http://www.researchgate.net/publication/2121594_On_the_Possible_Monoid_Structures_of_the_Natural_Numbers_N_or_Finding_All_Associative_Binary_Operations_on_N

Look at equation 3.2 and proposition 3.1. As is usual, someone has thought about this stuff before. It looks to me like if you use R instead of N, let ##f(x,y) = x+y## and ##\omega=\sinh##, the result follows from the associativity of addition. And, as you have observed, there are other bijections ##\omega##.

I mused as to whether there's an entire field (functions on operators) that could open up here (or maybe already has). As has been noted in this thread, if ##f## is an invertible map ##S\to S## and ##\circ## is an operator ##S\times S\to S##, then ##f## induces an operator ##\circ_f## by ##x \circ_f y = f(f^{-1}(x)\circ f^{-1}(y))##. If ##\circ## is associative, so is ##\circ_f##. Likewise commutativity. More surprisingly perhaps, if ##\circ## distributes across another operator ##\circ'##, then ##\circ_f## distributes across ##\circ'_f##.
If ##f## is linear then it's rather trivial: ##\circ_f = \circ##. Some other examples:
##f(x) = e^x##: ##+_f = \times## (or, by abuse of notation, ##e^{+} = \times##). Using the same shorthand, ##x \; e^{\times} \; y = x^{\ln(y)} = y^{\ln(x)}##.
##f(x) = \sqrt{x}##: ##x \; \sqrt{+} \; y = \sqrt{x^2+y^2}##.
##f(x) = \tan(x)##: ##x \; \tan(+) \; y = (x+y)/(1-xy)##.
It's suggestive, but needs a deeper result to make it interesting.
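Here is a rough Python sketch of that construction, checked only numerically; the helper names induced and looks_associative are mine, and the sampling domains are restricted so each bijection and its inverse stay defined (the tan example above would need the sums of arctans kept inside the principal branch, so it's left out here).

```python
import math
import random
from operator import add

def induced(f, f_inv, op):
    """Transport a binary operation op through a bijection f:
    x (op_f) y = f(op(f_inv(x), f_inv(y)))."""
    return lambda x, y: f(op(f_inv(x), f_inv(y)))

exp_times = induced(math.exp, math.log, add)           # exp(ln x + ln y): ordinary multiplication
sqrt_add  = induced(math.sqrt, lambda t: t * t, add)   # sqrt(x^2 + y^2)
sinh_add  = induced(math.sinh, math.asinh, add)        # the f(x, y) of this thread

def looks_associative(star, samples=1000, lo=0.1, hi=5.0):
    # Numerical spot check of (x*y)*z == x*(y*z) on random positive triples.
    for _ in range(samples):
        x, y, z = (random.uniform(lo, hi) for _ in range(3))
        if not math.isclose(star(x, star(y, z)), star(star(x, y), z),
                            rel_tol=1e-9, abs_tol=1e-9):
            return False
    return True

print(looks_associative(exp_times), looks_associative(sqrt_add), looks_associative(sinh_add))
```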
 
Last edited by a moderator:
  • #49
haruspex said:
I mused as to whether there's an entire field (functions on operators) that could open up here (or maybe already has). As has been noted in this thread, if ##f## is an invertible map ##S\to S## and ##\circ## is an operator ##S\times S\to S##, then ##f## induces an operator ##\circ_f## by ##x \circ_f y = f(f^{-1}(x)\circ f^{-1}(y))##. If ##\circ## is associative, so is ##\circ_f##. Likewise commutativity. More surprisingly perhaps, if ##\circ## distributes across another operator ##\circ'##, then ##\circ_f## distributes across ##\circ'_f##.
If ##f## is linear then it's rather trivial: ##\circ_f = \circ##. Some other examples:
##f(x) = e^x##: ##+_f = \times## (or, by abuse of notation, ##e^{+} = \times##). Using the same shorthand, ##x \; e^{\times} \; y = x^{\ln(y)} = y^{\ln(x)}##.
##f(x) = \sqrt{x}##: ##x \; \sqrt{+} \; y = \sqrt{x^2+y^2}##.
##f(x) = \tan(x)##: ##x \; \tan(+) \; y = (x+y)/(1-xy)##.
It's suggestive, but needs a deeper result to make it interesting.

Probably already has been thought about, that's what the monoids paper was about. I'm more interested in whether ArcanaNoir can figure out that this is a pretty easy problem with the right clues.
 
  • #50
Dick said:
I'm not sure that's terribly illuminating, I'd just say following micromass's hint, ##\sinh(arcsinh(x)+arcsinh(y))=\sinh(arcsinh(x)) \cosh(arcsinh(y))+\cosh(arcsinh(x)) \sinh(arcsinh(y))##. Using ##\cosh(arcsinh(x))=\sqrt{1+x^2}## this seems pretty straightforward. Sorry I can't find a texy way to do the inverse hyperbolics.
Well, it does give ##\displaystyle \ f(x,y)=\sinh(\text{arcsinh}(x)+\text{arcsinh}(y)) \ ## rather easily.

To get texy inverse hyperbolics I use

\text{arcsinh} .
 
  • #51
SammyS said:
Well, it does give ##\displaystyle \ f(x,y)=\sinh(\text{arcsinh}(x)+\text{arcsinh}(y)) \ ## rather easily.

To get texy inverse hyperbolics I use

\text{arcsinh} .

Yeah, I see what you are doing now. And thanks for the texy tip.
 
  • #52
Just confirmed with the professor, hyperbolic trig is the way to go. *still trying to get it*
 
  • #53
ArcanaNoir said:
Just confirmed with the professor, hyperbolic trig is the way to go. *still trying to get it*

Use ##\cosh^2 a - \sinh^2 a = 1##.

Btw, according to wiki, ##\text{arcsinh}## is a misnomer. It should be ##\text{arsinh}##.
 
  • #54
Okay, you guys have been wonderful, especially Dick and Micro. Although I'm beginning to suspect Micro is a genius, in which case he sneezes and answers pop out of him, so it's not like he has to try really hard. :P j/k I appreciate your tireless efforts Micro! (not kidding about thinking you're a genius...)
Anyway I'm trying to use Micro's hint but I feel like I'm slipping past the part where I'm supposed to rearrange the expressions to show they are equivalent. I will show my simplification for the expressions, maybe someone can point out where I was supposed to do something trig-y.

I have verified that ##\cosh (\sinh ^{-1}(x))=\sqrt{1+x^2}## and that ##f(x,y)=\sinh (\sinh ^{-1}(x)+\sinh ^{-1}(y))##.

A:
$$\begin{aligned}
(x\ast y)\ast z &= \sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1}(y)\bigr)\ast z \\
&= \sinh \bigl(\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1}(y)\bigr)\bigr]+\sinh ^{-1} (z)\bigr) \\
&= \sinh \bigl[\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1}(y)\bigr)\bigr]\bigr]\cosh \bigl(\sinh ^{-1} (z)\bigr)
 + \cosh \bigl[\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1}(y)\bigr)\bigr]\bigr]\sinh \bigl(\sinh ^{-1} (z)\bigr) \\
&= \sinh \bigl[\sinh ^{-1} \bigl[\sinh (\sinh ^{-1} (x))\cosh (\sinh ^{-1} (y))+\cosh (\sinh ^{-1} (x))\sinh (\sinh ^{-1} (y))\bigr]\bigr]\cdot \sqrt{1+z^2}
 + \cosh \bigl[\sinh ^{-1} \bigl[ \sinh (\sinh ^{-1} (x))\cosh (\sinh ^{-1}(y))+\cosh (\sinh ^{-1} (x))\sinh (\sinh ^{-1} (y))\bigr]\bigr]\cdot z \\
&= \sinh \bigl[\sinh ^{-1} \bigl[x\sqrt{1+y^2}+y\sqrt{1+x^2}\bigr]\bigr]\sqrt{1+z^2}+\cosh \bigl[\sinh ^{-1} \bigl[x\sqrt{1+y^2}+y\sqrt{1+x^2}\bigr]\bigr]z
\end{aligned}$$

B:
$$\begin{aligned}
x\ast (y\ast z) &= \sinh \bigl(\sinh ^{-1} (x)+\sinh ^{-1}(y\ast z)\bigr) \\
&= \sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(y) + \sinh ^{-1}(z)\bigr)\bigr]\bigr) \\
&= \sinh \bigl[\sinh ^{-1} (x) + \sinh ^{-1} \bigl[\sinh (\sinh ^{-1}(y))\cosh (\sinh ^{-1} (z))+\cosh (\sinh ^{-1} (y))\sinh (\sinh ^{-1} (z))\bigr]\bigr] \\
&= \sinh \bigl[ \sinh ^{-1}(x)+\sinh ^{-1} \bigl[y\sqrt{1+z^2}+z\sqrt{1+y^2}\bigr]\bigr] \\
&= \sinh (\sinh ^{-1} (x))\cosh \bigl(\sinh ^{-1} \bigl[y\sqrt{1+z^2}+z\sqrt{1+y^2}\bigr]\bigr)
 + \cosh (\sinh ^{-1} (x))\sinh \bigl(\sinh ^{-1}\bigl[y\sqrt{1+z^2}+z\sqrt{1+y^2}\bigr]\bigr) \\
&= x\sqrt{1+\bigl(y\sqrt{1+z^2}+z\sqrt{1+y^2}\bigr)^2}+\bigl[y\sqrt{1+z^2}+z\sqrt{1+y^2}\bigr]\sqrt{1+x^2}
\end{aligned}$$

So it seems to me that this isn't going to end any better than when I didn't use hyp. trig, which is why I think I'm missing the critical point.
 
Last edited:
  • #55
You've already typed too much. Stop! Look at your second line. The first argument to the outer sinh is arcsinh(sinh('something')). Just replace that with 'something'. You are overshooting!
 
  • #56
Dick said:
You've already typed too much. Stop! Look at your second line. The first argument to the outer sinh is arcsinh(sinh('something')). Just replace that with 'something'. You are overshooting!

whoops! lol didn't see you down here, was just trying to fix my tex :) *looking at second line*

Umm maybe I don't see what you mean, I can't simplify directly when there is a sum involved, right?
 
Last edited:
  • #57
ArcanaNoir said:
B:
$$\begin{aligned}
x\ast (y\ast z) &= \sinh \bigl(\sinh ^{-1} (x)+\sinh ^{-1}(y\ast z)\bigr) \\
&= \sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(y) + \sinh ^{-1}(z)\bigr)\bigr]\bigr) \\
&= \sinh \bigl[\sinh ^{-1} (x) + \sinh ^{-1} \bigl[\sinh (\sinh ^{-1}(y))\cosh (\sinh ^{-1} (z))+\cosh (\sinh ^{-1} (y))\sinh (\sinh ^{-1} (z))\bigr]\bigr]
\end{aligned}$$

Let's go in a different direction:

B:
$$\begin{aligned}
x\ast (y\ast z) &= \sinh \bigl(\sinh ^{-1} (x)+\sinh ^{-1}(y\ast z)\bigr) \\
&= \sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(y) + \sinh ^{-1}(z)\bigr)\bigr]\bigr) \\
&= \sinh \bigl(\sinh ^{-1}(x)+ \sinh ^{-1}(y) + \sinh ^{-1}(z)\bigr)
\end{aligned}$$

EDIT: Ah. too late.
 
  • #58
ArcanaNoir said:
whoops! lol didnt see you down here, was just trying to fix my tex :) *looking at second line*

Umm maybe I don't see what you mean, I can't simplify directly when there is a sum involved, right?

arcsinh(sinh( arcsinh(x)+arcsinh(y) ))=arcsinh(x)+arcsinh(y) is what ILS and I mean.
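Spelled out, it's nothing deeper than the definition of an inverse function; as a sketch:

$$\sinh^{-1}\bigl(\sinh(u)\bigr)=u \quad\text{for every real } u, \qquad\text{here with } u=\sinh^{-1}(x)+\sinh^{-1}(y).$$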
 
  • #59
I like Serena said:
Le's go into a different direction:

B:
$$\begin{aligned}
x\ast (y\ast z) &= \sinh \bigl(\sinh ^{-1} (x)+\sinh ^{-1}(y\ast z)\bigr) \\
&= \sinh \bigl(\sinh ^{-1}(x)+\sinh ^{-1} \bigl[\sinh \bigl(\sinh ^{-1}(y) + \sinh ^{-1}(z)\bigr)\bigr]\bigr) \\
&= \sinh \bigl(\sinh ^{-1}(x)+ \sinh ^{-1}(y) + \sinh ^{-1}(z)\bigr)
\end{aligned}$$

EDIT: Ah. too late.

how did you get from your second line to your third line?
 
  • #60
Dick said:
arcsinh(sinh( arcsinh(x)+arcsinh(y) ))=arcsinh(x)+arcsinh(y) is what I and ILS mean.

What identity are you using?
 
