
How far is Naxos from Nikolayevsk-on-Amur?

The distance between Nikolayevsk-on-Amur (Nikolayevsk-on-Amur Airport) and Naxos (Naxos Island National Airport) is 5115 miles / 8233 kilometers / 4445 nautical miles.


Distance from Nikolayevsk-on-Amur to Naxos

There are several ways to calculate the distance from Nikolayevsk-on-Amur to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 5115.451 miles
  • 8232.521 kilometers
  • 4445.206 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
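
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS airport data at the end of this page; the tolerance and iteration cap are arbitrary choices, not values quoted by the source.

  import math

  def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
      # Vincenty's inverse formula on the WGS-84 ellipsoid; returns meters.
      a = 6378137.0               # semi-major axis (m)
      f = 1 / 298.257223563       # flattening
      b = (1 - f) * a             # semi-minor axis (m)

      L = math.radians(lon2 - lon1)
      U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
      U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
      sinU1, cosU1 = math.sin(U1), math.cos(U1)
      sinU2, cosU2 = math.sin(U2), math.cos(U2)

      lam = L                     # first approximation of longitude difference
      for _ in range(max_iter):
          sin_lam, cos_lam = math.sin(lam), math.cos(lam)
          sin_sigma = math.hypot(cosU2 * sin_lam,
                                 cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
          if sin_sigma == 0:
              return 0.0          # coincident points
          cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
          sigma = math.atan2(sin_sigma, cos_sigma)
          sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
          cos2_alpha = 1 - sin_alpha ** 2
          cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
          C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
          lam_prev = lam
          lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
              cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
          if abs(lam - lam_prev) < tol:
              break

      u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
      A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
      B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
      d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
          cos_sigma * (-1 + 2 * cos_2sm ** 2)
          - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
      return b * A * (sigma - d_sigma)

  # NLI -> JNX in decimal degrees (converted from the DMS data below)
  meters = vincenty_distance(53.15472, 140.64972, 37.08083, 25.36806)
  print(f"{meters / 1000:.1f} km")   # ~8232.5 km, i.e. ~5115 miles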

Haversine formula
  • 5102.335 miles
  • 8211.412 kilometers
  • 4433.808 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
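
For comparison, here is a minimal Python sketch of the haversine formula. The mean earth radius of 6371 km is the conventional choice, though the source does not state which radius it uses.

  import math

  def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
      # Great-circle distance on a sphere of the given radius; returns km.
      phi1, phi2 = math.radians(lat1), math.radians(lat2)
      dphi = math.radians(lat2 - lat1)
      dlam = math.radians(lon2 - lon1)
      a = (math.sin(dphi / 2) ** 2
           + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
      return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

  # NLI -> JNX in decimal degrees (converted from the DMS data below)
  km = haversine_distance(53.15472, 140.64972, 37.08083, 25.36806)
  print(f"{km:.1f} km / {km / 1.609344:.1f} miles")  # ~8211 km / ~5102 miles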

How long does it take to fly from Nikolayevsk-on-Amur to Naxos?

The estimated flight time from Nikolayevsk-on-Amur Airport to Naxos Island National Airport is 10 hours and 11 minutes.
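
The page does not state its timing model, but working backwards from the figures above, the estimate corresponds to an average block speed of just over 500 mph. A sketch of that back-calculation (the ~502 mph speed is derived here, not quoted by the source):

  # Back-calculate the average speed implied by the quoted estimate.
  miles = 5115
  hours = 10 + 11 / 60                    # 10 h 11 min, from the estimate above
  print(f"implied average speed: {miles / hours:.0f} mph")   # ~502 mph

  # Reusing that speed as a rule of thumb to format an estimate:
  est_hours = miles / 502                 # assumed ~502 mph block speed
  h, m = divmod(round(est_hours * 60), 60)
  print(f"estimated flight time: {h} h {m} min")              # 10 h 11 min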

Flight carbon footprint between Nikolayevsk-on-Amur Airport (NLI) and Naxos Island National Airport (JNX)

On average, flying from Nikolayevsk-on-Amur to Naxos generates about 599 kg of CO2 per passenger; 599 kilograms is equivalent to roughly 1,320 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
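
A sketch of the unit conversion and the per-mile emission rate implied by those figures (the 2.20462 lbs/kg factor is standard; the per-mile rate is derived here, not quoted by the source):

  co2_kg = 599
  miles = 5115

  print(f"{co2_kg} kg = {co2_kg * 2.20462:,.1f} lbs")        # 599 kg = 1,320.6 lbs
  print(f"implied rate: {co2_kg / miles:.3f} kg CO2/mile")   # ~0.117 kg per mile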

Map of flight path from Nikolayevsk-on-Amur to Naxos

See the map of the shortest flight path between Nikolayevsk-on-Amur Airport (NLI) and Naxos Island National Airport (JNX).

Airport information

Origin Nikolayevsk-on-Amur Airport
City: Nikolayevsk-on-Amur
Country: Russia
IATA Code: NLI
ICAO Code: UHNN
Coordinates: 53°9′17″N, 140°38′59″E
Destination Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E
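
The distance figures above can be reproduced from these coordinates. A minimal sketch of the degrees/minutes/seconds to decimal-degrees conversion used in the examples earlier on this page (the helper name is illustrative):

  def dms_to_decimal(degrees, minutes, seconds, hemisphere):
      # South and West hemispheres are negative in decimal-degree convention.
      sign = -1 if hemisphere in ("S", "W") else 1
      return sign * (degrees + minutes / 60 + seconds / 3600)

  # NLI: 53°9′17″N, 140°38′59″E  ->  (53.15472, 140.64972)
  # JNX: 37°4′51″N,  25°22′5″E   ->  (37.08083, 25.36806)
  nli = (dms_to_decimal(53, 9, 17, "N"), dms_to_decimal(140, 38, 59, "E"))
  jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
  print(nli, jnx)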