# Bomb landing prediction problem

#### RisingFury

##### OBSP developer
As part of the work on OBSP, I'm working on a function capable of predicting the landing positions of bombs dropped from a plane, or shells fired from an artillery piece.

While the equations are an extreme pain in the butt, they're not the problem - the implementation seems to be.

I started by making a function that neglects the effects of air resistance, just to make sure the trivial bits are working, but they're not.

I calculate the time the bomb would need to drop to the ground and mark it as t. The starting longitude and latitude are marked as Lon and Lat. The planet's radius is r and the longitudinal and latitudinal velocities (in m/s) are marked as x and y.
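For reference, the drag-free fall time can be sketched with standard kinematics. This is a minimal standalone example, not the actual OBSP code; the names `h` (altitude) and `v0` (upward vertical speed) are illustrative:

```cpp
#include <cmath>

// Drag-free fall time from altitude h (m) with upward vertical speed v0 (m/s)
// under gravity g (m/s^2): solve 0 = h + v0*t - g*t^2/2 for the positive root.
double DropTime(double h, double v0, double g)
{
    return (v0 + std::sqrt(v0 * v0 + 2.0 * g * h)) / g;
}
```

From 1000 m with no vertical speed at g = 9.81 m/s², this gives about 14.3 s.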

x and y are obtained from GetHorizonAirspeedVector's x and z components.

The predicted longitude and latitude are then calculated:

Predicted lon = Lon + (x / r * t)
Predicted lat = Lat + (y / r * t)
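That first attempt can be sketched as follows (a standalone illustration with made-up names, not the actual OBSP code). As the thread goes on to show, treating the longitude rate as a plain x / r, without a cos(lat) correction, is what skews the predicted heading:

```cpp
#include <cmath>

// Naive impact prediction: no air resistance, velocity components treated as
// constant angular rates. Angles in radians; vx, vy in m/s; r in metres.
struct LatLon { double lon, lat; };

LatLon PredictNaive(double Lon, double Lat,
                    double vx, double vy, double r, double t)
{
    LatLon p;
    p.lon = Lon + (vx / r) * t;  // longitudinal angular rate times fall time
    p.lat = Lat + (vy / r) * t;  // latitudinal angular rate times fall time
    return p;
}
```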

To visually mark the position on the ground, I wrote code that places a DeltaGlider on that spot on Earth. That's when I noticed the problem...

When flying heading 0°, 90°, 180° or 270°, the predicted position is right under the velocity vector - as you'd expect. But when you fly any other heading, there is an error. The error is greatest when flying 45°, 135°, 225° or 315° and measures just under 4°.

Here's a screenshot:

The value in the bottom right of the screen is the heading of the velocity vector.

When flying headings from 0° to 90° or from 180° to 270°, the predicted position is to the left of the velocity vector. But when flying headings from 90° to 180° or from 270° to 0°, the predicted position is to the right.

Why does this error occur? If it was due to the spin of the Earth, the error would have been greatest at 0° or 180°. It can't be due to the Coriolis effect because the planet's radius is constant...

So, any ideas?

#### Urwumpe

##### Not funny anymore
Donator
It is due to the Coriolis effect; it is always accounted for in artillery tables. It is not only caused by the change in radius, but also by non-tangential motion. I remember the problems with it in the Atlas-F project - before I included a primitive correction for the Coriolis effect, the warhead went far off the target.


#### jedidia

##### shoemaker without legs
So, any ideas?

You could just drop a bomb and see if it lands where your algorithm predicted. If yes, everything's fine.

#### RisingFury

##### OBSP developer
Can't be Coriolis. The DeltaGlider on the ground does not signify where the bomb fell, just where the function predicts it will fall. I didn't model Coriolis in my prediction. If I actually drop the bomb, it lands right under the velocity vector, not to any side.

All I did was:
Predicted lon = Lon + (x / r * t)
Predicted lat = Lat + (y / r * t)

#### Urwumpe

##### Not funny anymore
Donator
In global coordinates? Including global velocity?

#### RisingFury

##### OBSP developer
In global coordinates? Including global velocity?

GetHorizonAirspeedVector() returns a velocity vector in local horizon coordinates, where x is the longitudinal component, y the vertical component and z the latitudinal component. They're all in m/s.

I turn them into angular velocities by dividing them by the planet's radius (or the vessel's distance from the planet's centre).

If I print out the velocity vector's x and z components when I'm flying heading 45°, they're the same. That means the change in longitude and latitude should also be the same, which means the angle from the current position to the predicted position, relative to vessel-north, should also be 45°. But it's not... it's some 41°.

---------- Post added 9th Aug 2010 at 02:10 ---------- Previous post was 8th Aug 2010 at 19:38 ----------

After watching StarGate all over again all night, I think I found my mistake.

Let's say you have a vessel at 0°, 0° and a target at 5°, 5°, those being equatorial coordinates. The angle between north-vessel-target is NOT 45°. I assumed it would be.

I'll have to write code that takes the current airspeed heading and predicts the location based on that...

#### Keith

##### Donator
Donator
After watching StarGate all over again all night, I think I found my mistake.

Let's say you have a vessel at 0°, 0° and a target at 5°, 5°, those being equatorial coordinates. The angle between north-vessel-target is NOT 45°. I assumed it would be.

I'll have to write code that takes the current airspeed heading and predicts the location based on that...

Put that way, it sounds almost like meridional convergence, a well-known feature in navigation. The distance between two meridians needs to be corrected by the cosine of the latitude.
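The effect is easy to sketch: equal angular offsets in longitude and latitude cover unequal ground distances, so the resulting bearing is not 45°. The 30° latitude in the example below is an assumption picked to reproduce the roughly 41° reading reported above:

```cpp
#include <cmath>

// Ground bearing produced by equal angular offsets in lon and lat: a unit of
// longitude spans cos(lat) times less ground distance than a unit of latitude.
// Input latitude and output bearing are in radians.
double BearingOfEqualOffsets(double lat)
{
    double east  = 1.0 * std::cos(lat);  // ground distance east, per unit angle
    double north = 1.0;                  // ground distance north, per unit angle
    return std::atan2(east, north);      // bearing measured from north
}
```

At the equator this returns exactly 45°; near 30° latitude it returns about 40.9°, close to the value measured above.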

#### RisingFury

##### OBSP developer
Ok. I figured out a solution and am posting it for reference.

You have a velocity vector heading marked as h; your current equatorial coordinates are marked x (longitude) and y (latitude). The predicted longitude and latitude are marked Px and Py respectively.

The ground distance the bomb will travel can be calculated from various parameters and formulas. I'm marking that distance as l and the planet's radius as r.

The predicted latitude is then given as
$Py = (Pi/2) - acos(cos(l/r) * cos((Pi/2) - y) + sin(l/r) * sin((Pi/2) - y) * cos(h))$

(the acos term on its own is the predicted colatitude, so it has to be subtracted from Pi/2).

The difference in longitude is given as

$dx = asin(sin(l/r) * sin(h) / sin((Pi/2) - Py))$

And from that the predicted longitude is then given as

$Px = x + dx$
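Putting the three formulas together as a self-contained sketch (plain C++, not tied to the Orbiter API; the function name is illustrative):

```cpp
#include <cmath>

const double PI = 3.14159265358979323846;

// Great-circle "destination point" prediction.
// x, y: current longitude/latitude (rad); h: velocity-vector heading
// (rad, from north); l: ground distance flown (m); r: planet radius (m).
// Outputs Px, Py: predicted longitude and latitude (rad).
void PredictImpact(double x, double y, double h, double l, double r,
                   double& Px, double& Py)
{
    double d = l / r;  // angular distance along the great circle
    // Spherical law of cosines on the pole-vessel-target triangle gives the
    // predicted colatitude; subtracting from Pi/2 gives the latitude.
    Py = PI / 2.0 - std::acos(std::cos(d) * std::cos(PI / 2.0 - y)
                            + std::sin(d) * std::sin(PI / 2.0 - y) * std::cos(h));
    // Spherical law of sines gives the longitude difference.
    double dx = std::asin(std::sin(d) * std::sin(h) / std::cos(Py));
    Px = x + dx;
}
```

Heading due north from the equator should change only the latitude, and heading due east only the longitude, which makes for an easy sanity check.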