How far is Prince George from Omaha, NE?

The distance between Omaha (Eppley Airfield) and Prince George (Prince George Airport) is 1509 miles / 2429 kilometers / 1312 nautical miles.

The driving distance from Omaha (OMA) to Prince George (YXS) is 1901 miles / 3059 kilometers, and travel time by car is about 35 hours 46 minutes.

Eppley Airfield – Prince George Airport

1509 miles / 2429 kilometers / 1312 nautical miles

Distance from Omaha to Prince George

There are several ways to calculate the distance from Omaha to Prince George. Here are two standard methods:

Vincenty's formula (applied above)
  • 1509.338 miles
  • 2429.043 kilometers
  • 1311.579 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
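
As a rough illustration, here is a self-contained sketch of the textbook Vincenty inverse iteration on the WGS-84 ellipsoid. This is not the calculator's actual code; the decimal-degree coordinates are converted from the airport listing at the end of this page.

    import math

    # WGS-84 ellipsoid constants
    A_AXIS = 6378137.0             # semi-major axis, metres
    FLATTENING = 1 / 298.257223563
    B_AXIS = (1 - FLATTENING) * A_AXIS

    def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        """Distance in metres between two points using Vincenty's inverse method."""
        f = FLATTENING
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (A_AXIS ** 2 - B_AXIS ** 2) / B_AXIS ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return B_AXIS * A * (sigma - d_sigma)  # metres

    # Eppley Airfield (OMA) to Prince George Airport (YXS)
    metres = vincenty_inverse(41.303056, -95.893889, 53.889167, -122.678889)
    print(round(metres / 1609.344, 3), "miles")  # should land near the 1509.338 mi figure above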

Haversine formula
  • 1506.447 miles
  • 2424.391 kilometers
  • 1309.067 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
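
For comparison, a minimal haversine sketch using the same coordinates; the 3958.8-mile mean Earth radius is the usual spherical value assumed here.

    import math

    def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
        """Great-circle distance on a sphere of the given mean radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_miles * math.asin(math.sqrt(a))

    print(round(haversine_miles(41.303056, -95.893889, 53.889167, -122.678889), 1))
    # about 1506.4 miles, in line with the haversine figure above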

How long does it take to fly from Omaha to Prince George?

The estimated flight time from Eppley Airfield to Prince George Airport is 3 hours and 21 minutes.
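
The page does not say how this figure is derived. A common rough rule adds a fixed taxi/climb/descent overhead to cruise time at an assumed average speed; the 30-minute overhead and 500 mph speed below are illustrative assumptions, not the calculator's parameters, so the result only roughly matches the 3 hours 21 minutes quoted above.

    def rough_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
        """Crude block-time estimate: assumed fixed overhead plus cruise at an assumed average speed."""
        total_min = round(overhead_min + distance_miles / cruise_mph * 60)
        hours, minutes = divmod(total_min, 60)
        return f"{hours} h {minutes} min"

    print(rough_flight_time(1509.338))  # about 3 h 31 min with these assumed parameters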

Flight carbon footprint between Eppley Airfield (OMA) and Prince George Airport (YXS)

On average, flying from Omaha to Prince George generates about 180 kg of CO2 per passenger (180 kilograms equals 397 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
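
The pound figure is just the standard unit conversion, which is easy to check (about 2.20462 lb per kg):

    print(round(180 * 2.20462))  # 397 lbs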

Map of flight path and driving directions from Omaha to Prince George

See the map of the shortest flight path between Eppley Airfield (OMA) and Prince George Airport (YXS).

Airport information

Origin: Eppley Airfield
City: Omaha, NE
Country: United States
IATA Code: OMA
ICAO Code: KOMA
Coordinates: 41°18′11″N, 95°53′38″W
Destination: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W