How far is Terrace from Napoli?
The distance between Napoli (Naples International Airport) and Terrace (Northwest Regional Airport Terrace-Kitimat) is 5507 miles / 8863 kilometers / 4786 nautical miles.
Naples International Airport – Northwest Regional Airport Terrace-Kitimat
Distance from Napoli to Terrace
There are several ways to calculate the distance from Napoli to Terrace. Here are two standard methods:
Vincenty's formula (applied above)
- 5507.417 miles
- 8863.328 kilometers
- 4785.814 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
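The iterative Vincenty inverse method can be sketched in Python on the WGS-84 ellipsoid. This is a standard textbook implementation, not the calculator's own code; the airport coordinates are taken from the tables below.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0                      # WGS-84 semi-major axis
    f = 1 / 298.257223563              # WGS-84 flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinLam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinLam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)

# NAP 40°53′9″N 14°17′26″E → YXT 54°28′6″N 128°34′33″W
d = vincenty_m(40.885833, 14.290556, 54.468333, -128.575833)
print(round(d / 1000, 1))  # ≈ 8863 km
```

The iteration usually converges in a handful of steps; the guard on `cos2_alpha` handles the degenerate equatorial case.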
Haversine formula
- 5492.247 miles
- 8838.915 kilometers
- 4772.632 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
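The haversine calculation is compact enough to show in full. This is a generic sketch assuming a mean Earth radius of 6371 km, with the airport coordinates from the tables below converted to decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

NAP = (40.885833, 14.290556)    # 40°53′9″N, 14°17′26″E
YXT = (54.468333, -128.575833)  # 54°28′6″N, 128°34′33″W

print(round(haversine_km(*NAP, *YXT), 1))  # ≈ 8839 km
```

The spherical result lands about 24 km short of the ellipsoidal Vincenty figure, which is typical for routes at these latitudes.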
How long does it take to fly from Napoli to Terrace?
The estimated flight time from Naples International Airport to Northwest Regional Airport Terrace-Kitimat is 10 hours and 55 minutes.
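A common rule of thumb for such estimates is a fixed overhead for taxi, climb, and descent plus cruise time at an assumed average speed. The 500 mph cruise and 30-minute overhead below are illustrative assumptions, not the calculator's published method (its 10 h 55 min figure implies a slightly faster average speed):

```python
def flight_time_hm(distance_miles, cruise_mph=500.0, overhead_min=30):
    """Rough block-time estimate: fixed overhead plus cruise time."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return h, m

h, m = flight_time_hm(5507.417)
print(f"{h} h {m} min")  # 11 h 31 min
```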
What is the time difference between Napoli and Terrace?
The time difference between Napoli and Terrace is 9 hours: Terrace is 9 hours behind Napoli.
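The offset can be checked with Python's `zoneinfo`. Terrace, BC falls in the Pacific zone (`America/Vancouver`); on a winter date both zones are on standard time, so the gap is 9 hours (it can briefly differ during the spring and autumn DST transition weeks):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

when = datetime(2024, 1, 15, 12, 0)  # a winter date: both on standard time
rome = when.replace(tzinfo=ZoneInfo("Europe/Rome"))          # UTC+1
terrace = when.replace(tzinfo=ZoneInfo("America/Vancouver"))  # UTC-8

gap = rome.utcoffset() - terrace.utcoffset()
print(gap)  # 9:00:00
```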
Flight carbon footprint between Naples International Airport (NAP) and Northwest Regional Airport Terrace-Kitimat (YXT)
On average, flying from Napoli to Terrace generates about 651 kg (roughly 1,435 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
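As a quick sanity check (not the site's methodology), the figures above imply a per-passenger emission factor of roughly 0.073 kg of CO2 per kilometre flown:

```python
KG_PER_LB = 0.45359237  # exact kg-per-pound conversion

distance_km = 8863.328  # Vincenty distance from above
co2_kg = 651.0          # per-passenger estimate from above

factor = co2_kg / distance_km
print(f"{factor:.4f} kg CO2 per passenger-km")  # ≈ 0.0734
print(f"{co2_kg / KG_PER_LB:.0f} lb")           # ≈ 1435
```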
Map of flight path from Napoli to Terrace
See the map of the shortest flight path between Naples International Airport (NAP) and Northwest Regional Airport Terrace-Kitimat (YXT).
Airport information
Origin | Naples International Airport |
---|---|
City: | Napoli |
Country: | Italy |
IATA Code: | NAP |
ICAO Code: | LIRN |
Coordinates: | 40°53′9″N, 14°17′26″E |
Destination | Northwest Regional Airport Terrace-Kitimat |
---|---|
City: | Terrace |
Country: | Canada |
IATA Code: | YXT |
ICAO Code: | CYXT |
Coordinates: | 54°28′6″N, 128°34′33″W |