
How far is Gods Lake Narrows from Nanjing?

The distance between Nanjing (Nanjing Lukou International Airport) and Gods Lake Narrows (Gods Lake Narrows Airport) is 6166 miles / 9924 kilometers / 5358 nautical miles.

Nanjing Lukou International Airport – Gods Lake Narrows Airport


Distance from Nanjing to Gods Lake Narrows

There are several ways to calculate the distance from Nanjing to Gods Lake Narrows. Here are two standard methods:

Vincenty's formula (applied above)
  • 6166.371 miles
  • 9923.812 kilometers
  • 5358.430 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
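As an illustration (not the calculator's own code), an ellipsoidal distance of this kind can be reproduced with the geopy library; its geodesic() function computes distances on the WGS-84 ellipsoid using Karney's algorithm, a modern refinement of Vincenty's approach. The coordinates are taken from the airport information below.

```python
# Sketch: ellipsoidal (WGS-84) distance between NKG and YGO via geopy.
from geopy.distance import geodesic

nkg = (31.742, 118.862)   # Nanjing Lukou International Airport
ygo = (54.559, -94.491)   # Gods Lake Narrows Airport (west longitude is negative)

d = geodesic(nkg, ygo)
print(f"{d.miles:.1f} miles")     # ~6166 miles
print(f"{d.kilometers:.1f} km")   # ~9924 km
print(f"{d.nautical:.1f} nm")     # ~5358 nm
```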

Haversine formula
  • 6152.275 miles
  • 9901.127 kilometers
  • 5346.181 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
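The haversine formula is simple enough to compute directly. Here is a minimal sketch in Python, assuming the conventional mean Earth radius of 3,958.8 miles and again using the airport coordinates listed below.

```python
# Minimal haversine sketch: great-circle distance on a spherical Earth.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points, in statute miles."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

# NKG -> YGO
print(f"{haversine_miles(31.742, 118.862, 54.559, -94.491):.1f} miles")  # ~6152.3
```

The roughly 14-mile gap between the two results comes from the spherical approximation; the ellipsoidal model accounts for the Earth's flattening.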

How long does it take to fly from Nanjing to Gods Lake Narrows?

The estimated flight time from Nanjing Lukou International Airport to Gods Lake Narrows Airport is 12 hours and 10 minutes.
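The page does not state its timing model, but the figure is consistent with dividing the distance by a typical long-haul average speed. In the sketch below, the ~507 mph average is back-solved from the quoted estimate; it is an assumption, not a published parameter.

```python
# Sketch: flight time as distance / average speed.
distance_miles = 6166.371   # Vincenty distance from above
avg_speed_mph = 507         # assumed average speed, back-solved from 12 h 10 min
hours = distance_miles / avg_speed_mph
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")     # -> 12 h 10 min
```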

Flight carbon footprint between Nanjing Lukou International Airport (NKG) and Gods Lake Narrows Airport (YGO)

On average, flying from Nanjing to Gods Lake Narrows generates about 739 kg of CO2 per passenger; 739 kilograms equals 1,630 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
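As a quick unit check (the per-mile rate below is an inference from this page's own numbers, not a published figure):

```python
# Sketch: sanity-checking the CO2 figures.
co2_kg = 739
print(round(co2_kg * 2.20462))      # kg -> lbs: 1629, shown as ~1,630 above
print(round(co2_kg / 6166.371, 3))  # implied rate: ~0.12 kg CO2 per passenger-mile
```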

Map of flight path from Nanjing to Gods Lake Narrows

See the map of the shortest flight path between Nanjing Lukou International Airport (NKG) and Gods Lake Narrows Airport (YGO).

Airport information

Origin: Nanjing Lukou International Airport
City: Nanjing
Country: China
IATA Code: NKG
ICAO Code: ZSNJ
Coordinates: 31°44′31″N, 118°51′43″E

Destination: Gods Lake Narrows Airport
City: Gods Lake Narrows
Country: Canada
IATA Code: YGO
ICAO Code: CYGO
Coordinates: 54°33′32″N, 94°29′29″W