How far is Naxos from Ust-Kuyga?

The distance between Ust-Kuyga (Ust-Kuyga Airport) and Naxos (Naxos Island National Airport) is 4282 miles / 6892 kilometers / 3721 nautical miles.

Ust-Kuyga Airport – Naxos Island National Airport

  • 4282 miles
  • 6892 kilometers
  • 3721 nautical miles

Distance from Ust-Kuyga to Naxos

There are several ways to calculate the distance from Ust-Kuyga to Naxos. Here are two standard methods:

Vincenty's formula (applied above)
  • 4282.481 miles
  • 6891.985 kilometers
  • 3721.374 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
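
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are decimal conversions of the DMS values listed under Airport information below; the function name and the iteration and tolerance settings are illustrative choices rather than the calculator's exact implementation.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Ellipsoidal distance (km) via Vincenty's inverse formula on WGS-84."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    big_l = math.radians(lon2 - lon1)
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = big_l
    for _ in range(iterations):  # may fail to converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0           # coincident points
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos_sq_alpha = 1.0 - sin_alpha ** 2
        if cos_sq_alpha != 0.0:
            cos_2sm = cos_sigma - 2.0 * sin_u1 * sin_u2 / cos_sq_alpha
        else:
            cos_2sm = 0.0        # both points on the equator
        c = f / 16.0 * cos_sq_alpha * (4.0 + f * (4.0 - 3.0 * cos_sq_alpha))
        lam_prev = lam
        lam = big_l + (1.0 - c) * f * sin_alpha * (
            sigma + c * sin_sigma * (cos_2sm + c * cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    big_a = 1.0 + u_sq / 16384.0 * (4096.0 + u_sq * (-768.0 + u_sq * (320.0 - 175.0 * u_sq)))
    big_b = u_sq / 1024.0 * (256.0 + u_sq * (-128.0 + u_sq * (74.0 - 47.0 * u_sq)))
    delta_sigma = big_b * sin_sigma * (cos_2sm + big_b / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sm ** 2)
        - big_b / 6.0 * cos_2sm * (-3.0 + 4.0 * sin_sigma ** 2) * (-3.0 + 4.0 * cos_2sm ** 2)))
    return b * big_a * (sigma - delta_sigma) / 1000.0

# DMS coordinates from the Airport information section, converted to decimal degrees.
ukg = (70 + 0 / 60 + 39 / 3600, 135 + 38 / 60 + 42 / 3600)   # UKG: 70°0′39″N, 135°38′42″E
jnx = (37 + 4 / 60 + 51 / 3600, 25 + 22 / 60 + 5 / 3600)     # JNX: 37°4′51″N, 25°22′5″E
print(round(vincenty_km(*ukg, *jnx)))                         # ≈ 6892 km, as quoted above
```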

Haversine formula
  • 4271.947 miles
  • 6875.032 kilometers
  • 3712.220 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
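
For comparison, here is a minimal Python sketch of the haversine formula, assuming the commonly used 6371 km mean Earth radius (the decimal coordinates are converted from the DMS values listed under Airport information below):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance (km) between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2.0 * radius_km * math.asin(math.sqrt(h))

# UKG (70°0′39″N, 135°38′42″E) to JNX (37°4′51″N, 25°22′5″E) in decimal degrees
km = haversine_km(70.0108, 135.6450, 37.0808, 25.3681)
print(f"{km:.0f} km / {km * 0.621371:.0f} mi / {km / 1.852:.0f} NM")  # ≈ 6875 km / 4272 mi / 3712 NM
```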

How long does it take to fly from Ust-Kuyga to Naxos?

The estimated flight time from Ust-Kuyga Airport to Naxos Island National Airport is 8 hours and 36 minutes.
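
Estimates like this are typically derived from the great-circle distance and an assumed average block speed. A rough sketch, assuming about 500 mph (an illustrative figure, not the calculator's exact method), lands close to the time quoted above:

```python
# Rough flight-time estimate: distance divided by an assumed average block speed.
distance_miles = 4282
avg_speed_mph = 500          # illustrative assumption
hours, minutes = divmod(round(distance_miles / avg_speed_mph * 60), 60)
print(f"{hours} h {minutes} min")   # ≈ 8 h 34 min, close to the 8 h 36 min quoted above
```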

Flight carbon footprint between Ust-Kuyga Airport (UKG) and Naxos Island National Airport (JNX)

On average, flying from Ust-Kuyga to Naxos generates about 492 kg (roughly 1,084 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
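
As a quick check of the arithmetic, the pound conversion and the per-kilometre intensity implied by these figures can be reproduced as follows (the 2.20462 lb/kg factor is standard; the per-kilometre value is simply derived from the numbers above):

```python
co2_kg = 492                 # per-passenger estimate quoted above
distance_km = 6892
co2_lb = co2_kg * 2.20462    # ≈ 1,084.7 lb, quoted above as 1,084 lbs
per_km_g = co2_kg / distance_km * 1000
print(f"{co2_lb:.1f} lb, {per_km_g:.0f} g CO2 per passenger-km")   # 1084.7 lb, ≈ 71 g/km
```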

Map of flight path from Ust-Kuyga to Naxos

See the map of the shortest flight path between Ust-Kuyga Airport (UKG) and Naxos Island National Airport (JNX).

Airport information

Origin: Ust-Kuyga Airport
City: Ust-Kuyga
Country: Russia
IATA Code: UKG
ICAO Code: UEBT
Coordinates: 70°0′39″N, 135°38′42″E
Destination: Naxos Island National Airport
City: Naxos
Country: Greece
IATA Code: JNX
ICAO Code: LGNX
Coordinates: 37°4′51″N, 25°22′5″E