The equations themselves are an extreme pain in the butt, but they're not the problem; the implementation seems to be.

I started by making a function that neglects the effects of air resistance, just to make sure the trivial bits are working, but they're not.

I calculate the time the bomb would need to fall to the ground and call it t. The starting longitude and latitude are Lon and Lat, the planet's radius is r, and the longitudinal and latitudinal velocities (in m/s) are x and y.
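For the no-drag case, the fall time comes from plain vertical kinematics. A minimal Python sketch, assuming flat ground and constant g (the function and argument names are just illustrative, not the actual code):

```python
import math

def drop_time(altitude_m, vertical_speed_ms, g=9.81):
    """Time for the bomb to reach the ground, neglecting air resistance.

    Solves h = v*t + (g/2)*t^2 for t, where v is the initial speed
    measured positive downward -- a flat-ground, constant-g approximation.
    """
    v = vertical_speed_ms
    return (-v + math.sqrt(v * v + 2.0 * g * altitude_m)) / g
```

From rest at 1000 m this gives sqrt(2*1000/9.81), about 14.3 s.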

x and y are obtained from GetHorizonVelocityVector's x and z components.

The predicted longitude and latitude are then calculated:

Predicted lon = Lon + (x / r) * t

Predicted lat = Lat + (y / r) * t
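In code, the linear update amounts to this (a Python sketch of the two formulas above; `predict_impact` is an illustrative name, and it assumes x is the eastward and y the northward ground speed, with Lon and Lat in radians):

```python
def predict_impact(lon, lat, x, y, r, t):
    """Predicted impact point from a linear lon/lat update.

    lon, lat : current position in radians
    x, y     : eastward / northward ground speed in m/s
    r        : planet radius in m
    t        : fall time in s
    """
    # x/r and y/r are treated as constant angular rates over the fall
    return lon + (x / r) * t, lat + (y / r) * t
```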

To visually mark the position on the ground, I wrote code that places a DeltaGlider on that spot on Earth. That's when I noticed the problem...

When flying heading 0°, 90°, 180° or 270°, the predicted position is right under the velocity vector, as you'd expect. But on any other heading there is an error. The error is greatest at headings 45°, 135°, 225° and 315°, where it measures just under 4°.

Here's a screenshot:

The value in the bottom right of the screen is the heading of the velocity vector.

When flying headings from 0° to 90° or from 180° to 270°, the predicted position is to the left of the velocity vector. But when flying headings from 90° to 180° or from 270° to 0°, the predicted position is to the right.

Why does this error occur? If it were due to the spin of the Earth, the error would have been greatest at 0° or 180°. It can't be due to the Coriolis effect because the planet's radius is constant...
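In case it helps anyone reproduce this, here's a quick Python sketch comparing my linear update with a standard spherical great-circle step (the textbook "direct problem" formulas for travelling a fixed distance along an initial azimuth). The radius value and function names are just for illustration:

```python
import math

R = 6.371e6  # mean Earth radius in m, for illustration

def linear_predict(lon, lat, heading, speed, t, r=R):
    # the linear update from the post: constant lon/lat angular rates
    east = speed * math.sin(heading)
    north = speed * math.cos(heading)
    return lon + (east / r) * t, lat + (north / r) * t

def great_circle_predict(lon, lat, heading, speed, t, r=R):
    # standard spherical formulas for moving an angular distance d
    # along an initial azimuth `heading` on a great circle
    d = speed * t / r
    lat2 = math.asin(math.sin(lat) * math.cos(d) +
                     math.cos(lat) * math.sin(d) * math.cos(heading))
    lon2 = lon + math.atan2(math.sin(heading) * math.sin(d) * math.cos(lat),
                            math.cos(d) - math.sin(lat) * math.sin(lat2))
    return lon2, lat2
```

Due north from the equator the two agree exactly; at heading 45° from latitude 45° they visibly diverge, so comparing them against the in-sim marker position might at least show whether the linearisation itself is the culprit.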

So, any ideas?