How far is Lublin from Naxos?
The distance between Naxos (Naxos Island National Airport) and Lublin (Lublin Airport) is 986 miles / 1587 kilometers / 857 nautical miles.
The driving distance from Naxos (JNX) to Lublin (LUZ) is 1506 miles / 2424 kilometers, and travel time by car is about 43 hours 5 minutes.
Naxos Island National Airport – Lublin Airport
Distance from Naxos to Lublin
There are several ways to calculate the distance from Naxos to Lublin. Here are two standard methods:
Vincenty's formula (applied above)
- 986.296 miles
- 1587.289 kilometers
- 857.068 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
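As a sketch, the standard Vincenty inverse iteration on the WGS-84 ellipsoid can be written in Python as follows. This is an illustrative implementation, not the calculator actually used by this page; the decimal coordinates are converted from the airport table below.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution: geodesic distance in metres on WGS-84."""
    a = 6378137.0                   # semi-major axis (m)
    f = 1 / 298.257223563           # flattening
    b = (1 - f) * a                 # semi-minor axis (m)

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):            # iterate until lambda converges
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# JNX (37°4′51″N, 25°22′5″E) to LUZ (51°14′25″N, 22°42′48″E)
jnx_luz_km = vincenty_m(37.080833, 25.368056, 51.240278, 22.713333) / 1000
```

For these coordinates the result is roughly 1587 km, in line with the figure quoted above.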
Haversine formula
- 986.942 miles
- 1588.329 kilometers
- 857.629 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
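The haversine formula is compact enough to show in full. A minimal Python sketch, using the airport coordinates from the table below and a mean Earth radius of 6371 km (the radius choice is an assumption; this page's calculator may use a slightly different value):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points (spherical Earth)."""
    R = 6371.0  # mean Earth radius in km
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# JNX (37°4′51″N, 25°22′5″E) to LUZ (51°14′25″N, 22°42′48″E)
jnx_luz_km = haversine_km(37.080833, 25.368056, 51.240278, 22.713333)
```

This gives roughly 1588 km, matching the haversine figure quoted above.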
How long does it take to fly from Naxos to Lublin?
The estimated flight time from Naxos Island National Airport to Lublin Airport is 2 hours and 22 minutes.
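The page does not say how the flight time is derived. A common rule of thumb (an assumption here, not this calculator's documented method) is time at a cruise speed of roughly 500 mph plus about 30 minutes for taxi, climb, and descent:

```python
def estimate_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough estimate: fixed ground/climb overhead plus time at cruise speed."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimate_flight_minutes(986)  # about 148 minutes
```

For 986 miles this yields roughly 2 hours 28 minutes, in the same ballpark as the 2 hours 22 minutes quoted above; the exact figure depends on the assumed cruise speed and overhead.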
What is the time difference between Naxos and Lublin?
The time difference between Naxos and Lublin is 1 hour. Lublin is 1 hour behind Naxos.
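Greece observes Eastern European Time and Poland Central European Time, and both switch to summer time on the same EU schedule, so the gap is 1 hour year-round. This can be checked with Python's standard `zoneinfo` module (Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library since Python 3.9

# Any fixed instant works; both zones follow the same EU daylight-saving rules.
when = datetime(2024, 1, 15, 12, 0, tzinfo=ZoneInfo("Europe/Athens"))
naxos_offset = when.utcoffset()                                         # UTC+2 in winter
lublin_offset = when.astimezone(ZoneInfo("Europe/Warsaw")).utcoffset()  # UTC+1 in winter
gap_hours = (naxos_offset - lublin_offset).total_seconds() / 3600
```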
Flight carbon footprint between Naxos Island National Airport (JNX) and Lublin Airport (LUZ)
On average, flying from Naxos to Lublin generates about 150 kg (330 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
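The kilogram-to-pound conversion behind the quoted figure is straightforward (1 kg ≈ 2.20462 lb):

```python
co2_kg = 150                  # estimated CO2 per passenger, from the page
co2_lb = co2_kg * 2.20462     # 1 kg is approximately 2.20462 lb
# co2_lb is about 330.7, which the page rounds to 330 lb
```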
Airport information
| Origin | Naxos Island National Airport |
| --- | --- |
| City | Naxos |
| Country | Greece |
| IATA code | JNX |
| ICAO code | LGNX |
| Coordinates | 37°4′51″N, 25°22′5″E |
| Destination | Lublin Airport |
| --- | --- |
| City | Lublin |
| Country | Poland |
| IATA code | LUZ |
| ICAO code | EPLB |
| Coordinates | 51°14′25″N, 22°42′48″E |