
How far is Saskatoon from Nagoya?

The distance between Nagoya (Nagoya Airfield) and Saskatoon (Saskatoon John G. Diefenbaker International Airport) is 5299 miles / 8528 kilometers / 4605 nautical miles.

Nagoya Airfield – Saskatoon John G. Diefenbaker International Airport

  • 5299 miles
  • 8528 kilometers
  • 4605 nautical miles


Distance from Nagoya to Saskatoon

There are several ways to calculate the distance from Nagoya to Saskatoon. Here are two standard methods:

Vincenty's formula (applied above)
  • 5298.820 miles
  • 8527.624 kilometers
  • 4604.549 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
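A minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iteration tolerance and iteration cap are standard choices, not necessarily the site's exact implementation; the coordinates are the NKM and YXE values from the airport information section, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty distance (km) on the WGS-84 ellipsoid."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    # Reduced latitudes and wrapped longitude difference in (-180, 180]
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(((lon2 - lon1 + 540) % 360) - 180)

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); equatorial lines have cos2_alpha == 0
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# NKM -> YXE (decimal degrees from the airport information section)
print(round(vincenty_km(35.255, 136.9239, 52.1706, -106.6997), 1))
```

The result agrees with the ~8527.6 km figure quoted above to within rounding of the input coordinates.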

Haversine formula
  • 5285.773 miles
  • 8506.627 kilometers
  • 4593.211 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
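The haversine formula is a few lines of Python; the sketch below assumes the conventional mean Earth radius of 6371 km, which reproduces the spherical figure quoted above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) on a sphere of mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# NKM -> YXE (decimal degrees from the airport information section)
print(round(haversine_km(35.255, 136.9239, 52.1706, -106.6997), 1))
```

Note the spherical result is about 21 km shorter than the ellipsoidal (Vincenty) one, a typical discrepancy at this range.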

How long does it take to fly from Nagoya to Saskatoon?

The estimated flight time from Nagoya Airfield to Saskatoon John G. Diefenbaker International Airport is 10 hours and 31 minutes.
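The page does not publish its timing formula, but a common rough estimate divides the distance by an assumed average block speed of about 500 mph, which lands close to the quoted 10 h 31 min:

```python
def flight_time(distance_miles, avg_speed_mph=500):
    """Rough flight-time estimate at an assumed average block speed."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    return divmod(total_minutes, 60)   # (hours, minutes)

print(flight_time(5299))   # -> (10, 36), close to the quoted 10 h 31 min
```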

Flight carbon footprint between Nagoya Airfield (NKM) and Saskatoon John G. Diefenbaker International Airport (YXE)

On average, flying from Nagoya to Saskatoon generates about 623 kg of CO2 per passenger; 623 kilograms equals 1,373 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
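The per-mile emission factor implied by these figures is 623 kg over 5,299 miles, about 0.118 kg of CO2 per passenger-mile; the sketch below reuses that derived factor (it is not the site's published methodology) together with the standard 2.20462 lbs/kg conversion:

```python
KG_PER_MILE = 623 / 5299    # factor implied by the figures above
LBS_PER_KG = 2.20462        # standard kilograms-to-pounds conversion

def co2_estimate(distance_miles):
    """CO2 per passenger as (kg, lbs) at the implied per-mile factor."""
    kg = distance_miles * KG_PER_MILE
    return round(kg), round(kg * LBS_PER_KG)

print(co2_estimate(5299))   # -> (623, 1373)
```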

Map of flight path from Nagoya to Saskatoon

See the map of the shortest flight path between Nagoya Airfield (NKM) and Saskatoon John G. Diefenbaker International Airport (YXE).

Airport information

Origin Nagoya Airfield
City: Nagoya
Country: Japan
IATA Code: NKM
ICAO Code: RJNA
Coordinates: 35°15′18″N, 136°55′26″E
Destination Saskatoon John G. Diefenbaker International Airport
City: Saskatoon
Country: Canada
IATA Code: YXE
ICAO Code: CYXE
Coordinates: 52°10′14″N, 106°41′59″W
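The coordinates above are in degrees/minutes/seconds; distance formulas need signed decimal degrees. A small conversion helper (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """DMS + hemisphere letter (N/S/E/W) -> signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# NKM: 35°15'18"N, 136°55'26"E ; YXE: 52°10'14"N, 106°41'59"W
nkm = (dms_to_decimal(35, 15, 18, "N"), dms_to_decimal(136, 55, 26, "E"))
yxe = (dms_to_decimal(52, 10, 14, "N"), dms_to_decimal(106, 41, 59, "W"))
print(nkm, yxe)
```

Note that western longitudes such as YXE's come out negative, which the distance formulas rely on.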