So, I am trying to work out whether something would be visible to someone standing on the surface of a planet, or on top of a mountain on it.
Imagine the planet as a perfect sphere, then imagine a moon whose orbit is perpendicular to the line from the observer to the centre of the sphere.
Let's say the moon has an orbital distance of $a$ (measured from the surface), the observer has a height of $h$, and the planet has a radius of $R$.
How would you calculate the height $h$ needed to see the moon, given a specific $a$ and $R$? It would be helpful if a formula could be given for this.
I tried working this out myself, i.e. finding the height needed to see the moon despite the curve of the planet, but I just can't work out how to get it, so I decided to come and ask here.
This isn't about the physics of it; it's just about the geometry. It also doesn't take the angular size of the moon into account, so treat the moon as a point.
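To pin down what I mean by "visible", here is the rough numerical check I have in mind, written as a quick Python sketch (the names, and the choice to measure $a$ from the surface, are just my own assumptions, matching how I use $R + a$ below): the moon counts as visible exactly when the straight sight line from the observer to the moon clears the sphere.

```python
import math

def moon_visible(R, h, a):
    """Is the moon above the observer's horizon?

    Work in the 2-D cross-section containing the planet's centre, the
    observer and the moon. Put the observer at (0, R + h) and the moon
    at (R + a, 0), so the two directions are at right angles as seen
    from the centre (a is measured from the surface here).
    The sight line clears the planet exactly when its perpendicular
    distance from the centre, (R+h)(R+a)/sqrt((R+h)^2 + (R+a)^2),
    is at least R.
    """
    u = R + h  # observer's distance from the centre
    v = R + a  # moon's distance from the centre
    return u * v / math.hypot(u, v) >= R

def min_height_numeric(R, a, tol=1e-6):
    """Search for the smallest h that makes the moon just visible.
    This is the quantity I'd like a formula for; here I only bracket
    it and bisect, relying on visibility improving as h grows.
    """
    lo, hi = 0.0, R
    while not moon_visible(R, hi, a):  # grow the bracket until visible
        hi *= 2.0
    while hi - lo > tol:               # then bisect down to tolerance
        mid = 0.5 * (lo + hi)
        if moon_visible(R, mid, a):
            hi = mid
        else:
            lo = mid
    return hi
```

This at least lets me check particular numbers, but I would much rather have a closed-form formula in terms of $a$ and $R$.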
I tried something suggested in the comments, but my result was just:
$h = \sqrt{2R^2 + 2aR + a^2} - R$
from $R + h = \sqrt{R^2 + (R+a)^2}$.
This, however, seems wrong, since it makes $h > a$, which can't be right. (I'm not sure what other work to show, and it's hard to write MathJax on mobile.)
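For example, plugging rough Earth/Moon-scale numbers (just placeholder values, with $a$ measured from the surface) into that formula:

```python
import math

R = 6371.0    # planetary radius in km (rough Earth value, placeholder)
a = 384400.0  # moon's orbital distance in km (rough Earth/Moon value, placeholder)

# First attempt: h = sqrt(2R^2 + 2aR + a^2) - R
h = math.sqrt(2 * R**2 + 2 * a * R + a**2) - R
print(h, h > a)  # h comes out slightly larger than a, which can't be right
```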
I also tried:
$R + h = \sqrt{R^2 + R^2}$, which gives $h = \sqrt{2}\,R - R$. That seems more reasonable, since $h < a$, but I'm not sure the math works out, especially as it doesn't take $a$ into account at all.
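With the same placeholder numbers, this second attempt at least stays below $a$, though it clearly can't be right in general because $a$ never appears in it:

```python
import math

R = 6371.0                # same placeholder planetary radius in km
h = math.sqrt(2) * R - R  # second attempt: (sqrt(2) - 1) * R, about 0.414 * R
print(h)                  # roughly 2600 km, independent of the moon's distance
```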
I could be missing something obvious, but what I really want is a way to tell whether something would be visible given these variables, and I don't think what I'm doing now gets me there.