How far is Naxos from Kirov?
The distance between Kirov (Pobedilovo Airport) and Naxos (Naxos Island National Airport) is 1833 miles / 2949 kilometers / 1593 nautical miles.
The driving distance from Kirov (KVX) to Naxos (JNX) is 2866 miles / 4612 kilometers, and travel time by car is about 73 hours 3 minutes.
Pobedilovo Airport – Naxos Island National Airport
Distance from Kirov to Naxos
There are several ways to calculate the distance from Kirov to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 1832.682 miles
- 2949.416 kilometers
- 1592.557 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
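For readers who want to reproduce the figure, here is a minimal Python sketch of the standard Vincenty inverse method on the WGS-84 ellipsoid. The function name, convergence tolerance, and iteration cap are illustrative choices, not anything published on this page:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); cos2_alpha == 0 only for points on the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# KVX and JNX in decimal degrees (see the coordinate conversion at the end)
metres = vincenty_distance(58.50306, 49.34806, 37.08083, 25.36806)
print(metres / 1609.344)   # ≈ 1832.7 miles, in line with the figure above
```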
Haversine formula
- 1830.938 miles
- 2946.609 kilometers
- 1591.042 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
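The spherical calculation is much shorter. A self-contained sketch, using the common 6371 km mean earth radius (the page does not state which radius it uses):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(58.50306, 49.34806, 37.08083, 25.36806))
# ≈ 2946.6 km, matching the quoted figure to within coordinate rounding
```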
How long does it take to fly from Kirov to Naxos?
The estimated flight time from Pobedilovo Airport to Naxos Island National Airport is 3 hours and 58 minutes.
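The page does not state how this estimate is derived; a common back-of-the-envelope approach simply divides the great-circle distance by an average block speed. A sketch, where the 462 mph speed is an assumption picked to roughly reproduce the quoted time rather than a figure from the page:

```python
def flight_time(distance_miles, block_speed_mph=462):
    """Rough block-time estimate: distance / average block speed.
    462 mph is an illustrative assumption, not the site's model."""
    hours = distance_miles / block_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(flight_time(1832.682))  # -> "3 h 58 min"
```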
What is the time difference between Kirov and Naxos?
The time difference between Kirov and Naxos is 1 hour: Naxos is 1 hour behind Kirov. Kirov stays on UTC+3 all year, while Greece observes daylight saving time, so during European summer time the two share the same clock time.
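A quick way to check the offset with Python's standard zoneinfo module (Naxos falls under the Europe/Athens zone; the sample date is an arbitrary winter day, and on some platforms the third-party tzdata package is needed for the zone database):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

when = datetime(2024, 1, 15, 12, 0, tzinfo=timezone.utc)  # a winter date
kirov = when.astimezone(ZoneInfo("Europe/Kirov"))
naxos = when.astimezone(ZoneInfo("Europe/Athens"))  # Naxos uses Athens time
print(kirov.utcoffset() - naxos.utcoffset())
# -> 1:00:00 in winter; 0:00:00 during European summer time
```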
Flight carbon footprint between Pobedilovo Airport (KVX) and Naxos Island National Airport (JNX)
On average, flying from Kirov to Naxos generates about 203 kg (roughly 447 lb) of CO2 per passenger. This figure is an estimate and covers only the CO2 produced by burning jet fuel.
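The unit conversion is straightforward; a small sketch, where the 203 kg input is the page's estimate and the rest are standard conversion factors:

```python
KG_PER_LB = 0.45359237                  # exact definition of the pound
co2_kg = 203                            # the page's per-passenger estimate
print(f"{co2_kg / KG_PER_LB:.1f} lb")   # -> 447.5 lb (the page rounds to 447)
print(f"{co2_kg / 1832.682:.3f} kg CO2 per mile flown")  # ≈ 0.111
```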
Map of flight path and driving directions from Kirov to Naxos
See the map of the shortest flight path between Pobedilovo Airport (KVX) and Naxos Island National Airport (JNX).
Airport information
| Origin | Pobedilovo Airport |
| --- | --- |
| City | Kirov |
| Country | Russia |
| IATA Code | KVX |
| ICAO Code | USKK |
| Coordinates | 58°30′11″N, 49°20′53″E |
| Destination | Naxos Island National Airport |
| --- | --- |
| City | Naxos |
| Country | Greece |
| IATA Code | JNX |
| ICAO Code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |
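To use the tabulated coordinates with the distance formulas above, they first need converting from degrees/minutes/seconds to decimal degrees. A small helper (the function name is a hypothetical choice for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Coordinates from the airport tables above
kvx = (dms_to_decimal(58, 30, 11, "N"), dms_to_decimal(49, 20, 53, "E"))
jnx = (dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
print(kvx)  # (58.50305..., 49.34805...)
print(jnx)  # (37.08083..., 25.36805...)
```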