How far is Naxos from Kharkiv?
The distance between Kharkiv (Kharkiv International Airport) and Naxos (Naxos Island National Airport) is 1040 miles / 1674 kilometers / 904 nautical miles.
The driving distance from Kharkiv (HRK) to Naxos (JNX) is 1694 miles / 2727 kilometers, and travel time by car is about 52 hours 52 minutes.
Kharkiv International Airport – Naxos Island National Airport
Distance from Kharkiv to Naxos
There are several ways to calculate the distance from Kharkiv to Naxos. Here are two standard methods:
Vincenty's formula (applied above)
- 1040.340 miles
- 1674.266 kilometers
- 904.031 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
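For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The airport coordinates are decimal-degree conversions of the values in the tables further down; the convergence tolerance and iteration cap are assumptions, not the calculator's exact settings.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate longitude on the auxiliary sphere until it converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha) if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HRK (49°55′29″N, 36°17′24″E) to JNX (37°4′51″N, 25°22′5″E), in decimal degrees
metres = vincenty_distance(49.924722, 36.290000, 37.080833, 25.368056)
print(f"{metres / 1000:.1f} km / {metres / 1609.344:.1f} mi")  # should land near the ~1674 km figure above
```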
Haversine formula
- 1040.203 miles
- 1674.045 kilometers
- 903.912 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
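The haversine calculation is much shorter. The sketch below assumes a mean Earth radius of 6,371 km, which is a common convention rather than the site's stated choice, so the last digits may differ slightly from the figure above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere of the given radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# HRK -> JNX, using the same decimal-degree coordinates as above
print(round(haversine_distance(49.924722, 36.290000, 37.080833, 25.368056), 1))  # ≈ 1674 km
```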
How long does it take to fly from Kharkiv to Naxos?
The estimated flight time from Kharkiv International Airport to Naxos Island National Airport is 2 hours and 28 minutes.
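Estimates like this are typically built from the great-circle distance, an assumed average cruise speed, and a fixed allowance for taxi, climb, and descent. The exact parameters behind the 2 h 28 min figure aren't stated, so the cruise speed and 30-minute allowance below are illustrative assumptions only.

```python
def estimated_flight_time(distance_km, cruise_kmh=840.0, overhead_min=30.0):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.

    cruise_kmh and overhead_min are assumed values, not the site's parameters.
    """
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes:02d} min"

print(estimated_flight_time(1674.3))  # roughly 2 h 30 min under these assumptions
```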
What is the time difference between Kharkiv and Naxos?
There is no time difference between Kharkiv and Naxos. Both cities observe Eastern European Time (UTC+2, or UTC+3 during daylight saving time).
Flight carbon footprint between Kharkiv International Airport (HRK) and Naxos Island National Airport (JNX)
On average, flying from Kharkiv to Naxos generates about 153 kg of CO2 per passenger; 153 kilograms is equal to about 337 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
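To make the arithmetic explicit, the pound conversion and the per-kilometre rate implied by the quoted figure work out as follows; the 1,674 km divisor is simply the great-circle distance above, not a published emission factor.

```python
CO2_PER_PASSENGER_KG = 153          # figure quoted above
KG_TO_LB = 2.20462
DISTANCE_KM = 1674.3                # great-circle distance from the section above

print(round(CO2_PER_PASSENGER_KG * KG_TO_LB))                 # ≈ 337 lb per passenger
print(round(CO2_PER_PASSENGER_KG / DISTANCE_KM * 1000))       # ≈ 91 g CO2 per passenger-km implied
```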
Map of flight path and driving directions from Kharkiv to Naxos
See the map of the shortest flight path between Kharkiv International Airport (HRK) and Naxos Island National Airport (JNX).
Airport information
| Origin | Kharkiv International Airport |
| --- | --- |
| City: | Kharkiv |
| Country: | Ukraine |
| IATA Code: | HRK |
| ICAO Code: | UKHH |
| Coordinates: | 49°55′29″N, 36°17′24″E |
| Destination | Naxos Island National Airport |
| --- | --- |
| City: | Naxos |
| Country: | Greece |
| IATA Code: | JNX |
| ICAO Code: | LGNX |
| Coordinates: | 37°4′51″N, 25°22′5″E |
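The coordinates in both tables are given in degrees, minutes, and seconds. A small helper like the one below (a hypothetical utility, not part of any particular library) converts them to the decimal degrees used in the distance snippets above.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W) to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# HRK: 49°55′29″N, 36°17′24″E  ->  (49.924722, 36.290000)
print(dms_to_decimal(49, 55, 29, "N"), dms_to_decimal(36, 17, 24, "E"))
# JNX: 37°4′51″N, 25°22′5″E    ->  (37.080833, 25.368056)
print(dms_to_decimal(37, 4, 51, "N"), dms_to_decimal(25, 22, 5, "E"))
```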