How far is North Platte, NE, from Santorini?
The distance between Santorini (Santorini (Thira) International Airport) and North Platte (North Platte Regional Airport) is 6104 miles / 9823 kilometers / 5304 nautical miles.
Santorini (Thira) International Airport – North Platte Regional Airport
Distance from Santorini to North Platte
There are several ways to calculate the distance from Santorini to North Platte. Here are two standard methods:
Vincenty's formula (applied above)
- 6103.733 miles
- 9823.006 kilometers
- 5303.999 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
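Vincenty's inverse method iterates on an auxiliary longitude until it converges on the ellipsoid. A minimal Python sketch on the WGS-84 ellipsoid is below; the coordinates are converted from the airport table further down. This is an illustrative implementation, not necessarily the exact code behind the figure above.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns distance in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# JTR: 36°23′57″N, 25°28′45″E   LBF: 41°7′34″N, 100°41′2″W
d = vincenty_km(36 + 23/60 + 57/3600, 25 + 28/60 + 45/3600,
                41 + 7/60 + 34/3600, -(100 + 41/60 + 2/3600))
print(round(d, 1), "km")   # ≈ 9823 km
```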
Haversine formula
- 6089.632 miles
- 9800.312 kilometers
- 5291.745 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
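The haversine computation is short enough to sketch in full. The Python below uses the conventional mean Earth radius of 6,371 km and the coordinates from the airport table further down.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# JTR: 36°23′57″N, 25°28′45″E   LBF: 41°7′34″N, 100°41′2″W
jtr = (36 + 23/60 + 57/3600, 25 + 28/60 + 45/3600)
lbf = (41 + 7/60 + 34/3600, -(100 + 41/60 + 2/3600))

d_km = haversine_km(jtr[0], jtr[1], lbf[0], lbf[1])
print(round(d_km, 1), "km")             # ≈ 9800 km
print(round(d_km / 1.609344, 1), "mi")  # ≈ 6090 mi
```

The spherical result comes in about 23 km shorter than the ellipsoidal Vincenty figure, which is the expected size of discrepancy between the two models over a distance like this.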
How long does it take to fly from Santorini to North Platte?
The estimated flight time from Santorini (Thira) International Airport to North Platte Regional Airport is 12 hours and 3 minutes.
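Estimates like this are typically the distance divided by an assumed average block speed. The speed used by the site is not published; the default of 500 mph below is a common rule-of-thumb assumption, which is why the sketch lands near, but not exactly on, the 12 h 3 min figure above.

```python
def flight_time(distance_miles, avg_speed_mph=500):
    """Rough flight-time estimate from distance and an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:          # guard against rounding up to a full hour
        h, m = h + 1, 0
    return h, m

h, m = flight_time(6104)
print(f"{h} hours and {m} minutes")
```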
What is the time difference between Santorini and North Platte?
The time difference between Santorini and North Platte is 8 hours. North Platte is 8 hours behind Santorini.
Flight carbon footprint between Santorini (Thira) International Airport (JTR) and North Platte Regional Airport (LBF)
On average, flying from Santorini to North Platte generates about 731 kg of CO2 per passenger; 731 kilograms is equal to 1,611 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path from Santorini to North Platte
See the map of the shortest flight path between Santorini (Thira) International Airport (JTR) and North Platte Regional Airport (LBF).
Airport information
Origin | Santorini (Thira) International Airport
--- | ---
City: | Santorini
Country: | Greece
IATA Code: | JTR
ICAO Code: | LGSR
Coordinates: | 36°23′57″N, 25°28′45″E
Destination | North Platte Regional Airport
--- | ---
City: | North Platte, NE
Country: | United States
IATA Code: | LBF
ICAO Code: | KLBF
Coordinates: | 41°7′34″N, 100°41′2″W