How far is Latrobe, PA, from Nain?
The distance between Nain (Nain Airport) and Latrobe (Arnold Palmer Regional Airport) is 1380 miles / 2221 kilometers / 1199 nautical miles.
The driving distance from Nain (YDP) to Latrobe (LBE) is 2247 miles / 3616 kilometers, and travel time by car is about 73 hours 43 minutes.
Nain Airport – Arnold Palmer Regional Airport
Distance from Nain to Latrobe
There are several ways to calculate the distance from Nain to Latrobe. Here are two standard methods:
Vincenty's formula (applied above)
- 1380.010 miles
- 2220.911 kilometers
- 1199.196 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
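As a sketch of the ellipsoidal calculation, the snippet below uses the geographiclib package, which implements Karney's geodesic algorithm rather than Vincenty's iteration; both solve the same WGS-84 ellipsoid problem and agree here to well under a metre. The decimal coordinates are converted from the DMS values listed in the airport tables below.

```python
# Ellipsoidal (WGS-84) geodesic distance between YDP and LBE.
# geographiclib's algorithm is not Vincenty's, but gives an equivalent result.
from geographiclib.geodesic import Geodesic

YDP = (56.549167, -61.680278)   # Nain Airport, decimal degrees
LBE = (40.275833, -79.404722)   # Arnold Palmer Regional Airport, decimal degrees

result = Geodesic.WGS84.Inverse(YDP[0], YDP[1], LBE[0], LBE[1])
meters = result["s12"]                     # geodesic distance in metres
print(f"{meters / 1609.344:.3f} miles")    # roughly 1380 miles, as above
```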
Haversine formula
- 1378.618 miles
- 2218.670 kilometers
- 1197.986 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
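A minimal, self-contained sketch of the haversine calculation is below. It assumes a mean earth radius of 6371.0088 km; the exact radius used for the figures above is not stated, so the result may differ slightly from 1378.618 miles.

```python
# Great-circle (haversine) distance between two latitude/longitude points.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0088):
    """Distance between two points on a sphere of the given radius, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(56.549167, -61.680278, 40.275833, -79.404722)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")   # roughly 2219 km / 1379 mi
```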
How long does it take to fly from Nain to Latrobe?
The estimated flight time from Nain Airport to Arnold Palmer Regional Airport is 3 hours and 6 minutes.
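The site's exact flight-time model is not stated. A common back-of-the-envelope approach, shown below, divides the great-circle distance by an assumed average cruise speed and adds a fixed allowance for taxi, climb and descent; with the illustrative parameters chosen here it lands in the same ballpark as the 3 hours 6 minutes quoted above.

```python
# Rough flight-time estimate; speed and taxi allowance are illustrative guesses.
def estimate_flight_time(distance_miles, cruise_mph=500, taxi_minutes=30):
    minutes = distance_miles / cruise_mph * 60 + taxi_minutes
    hours, mins = divmod(round(minutes), 60)
    return f"{hours} h {mins} min"

print(estimate_flight_time(1380))   # about 3 h 16 min with these assumptions
```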
What is the time difference between Nain and Latrobe?
The time difference between Nain and Latrobe is 1 hour: Latrobe is 1 hour behind Nain.
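The offset difference can be checked with the Python standard library, as in the sketch below. The time-zone IDs are assumptions: Nain, Labrador is taken to follow America/Goose_Bay (Atlantic Time) and Latrobe, PA America/New_York (Eastern Time).

```python
# Compare current UTC offsets of the two cities (assumed time-zone IDs).
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(ZoneInfo("UTC"))
nain_offset = now.astimezone(ZoneInfo("America/Goose_Bay")).utcoffset()
latrobe_offset = now.astimezone(ZoneInfo("America/New_York")).utcoffset()
diff_hours = (nain_offset - latrobe_offset).total_seconds() / 3600
print(f"Latrobe is {diff_hours:g} hour(s) behind Nain")   # 1 hour
```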
Flight carbon footprint between Nain Airport (YDP) and Arnold Palmer Regional Airport (LBE)
On average, flying from Nain to Latrobe generates about 172 kg (379 lbs) of CO2 per passenger. The figures are estimates and include only the CO2 generated by burning jet fuel.
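A back-of-the-envelope version of this estimate multiplies the flight distance by a per-passenger emission factor. The factor below is an illustrative assumption chosen to be consistent with the ~172 kg figure above, not the site's actual model.

```python
# Rough per-passenger CO2 estimate; the emission factor is an assumption.
KG_CO2_PER_PASSENGER_MILE = 0.125   # assumed short/medium-haul factor
KG_TO_LBS = 2.20462

distance_miles = 1380
co2_kg = distance_miles * KG_CO2_PER_PASSENGER_MILE
print(f"{co2_kg:.0f} kg CO2 = {co2_kg * KG_TO_LBS:.0f} lbs")   # ~173 kg / ~380 lbs
```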
Map of flight path and driving directions from Nain to Latrobe
See the map of the shortest flight path between Nain Airport (YDP) and Arnold Palmer Regional Airport (LBE).
Airport information
| Origin | Nain Airport |
|---|---|
| City | Nain |
| Country | Canada |
| IATA Code | YDP |
| ICAO Code | CYDP |
| Coordinates | 56°32′57″N, 61°40′49″W |

| Destination | Arnold Palmer Regional Airport |
|---|---|
| City | Latrobe, PA |
| Country | United States |
| IATA Code | LBE |
| ICAO Code | KLBE |
| Coordinates | 40°16′33″N, 79°24′17″W |
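
The coordinates above are given in degrees, minutes and seconds; the small helper below converts them to the decimal degrees used in the distance examples earlier on this page (west and south hemispheres become negative).

```python
# Convert DMS coordinates to decimal degrees.
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Nain Airport (YDP): 56°32′57″N, 61°40′49″W
print(dms_to_decimal(56, 32, 57, "N"), dms_to_decimal(61, 40, 49, "W"))
# Arnold Palmer Regional Airport (LBE): 40°16′33″N, 79°24′17″W
print(dms_to_decimal(40, 16, 33, "N"), dms_to_decimal(79, 24, 17, "W"))
```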