How far is Lopez, WA, from Joplin, MO?
The distance between Joplin (Joplin Regional Airport) and Lopez (Lopez Island Airport) is 1630 miles / 2623 kilometers / 1416 nautical miles.
The driving distance from Joplin (JLN) to Lopez (LPS) is 2086 miles / 3357 kilometers, and travel time by car is about 37 hours 26 minutes.
Joplin Regional Airport – Lopez Island Airport
Distance from Joplin to Lopez
There are several ways to calculate the distance from Joplin to Lopez. Here are two standard methods:
Vincenty's formula (applied above)
- 1629.754 miles
- 2622.835 kilometers
- 1416.218 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
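A minimal sketch of an ellipsoidal distance calculation, assuming pyproj is installed (`pip install pyproj`). pyproj uses Karney's geodesic algorithm rather than Vincenty's, but both model the Earth as the WGS84 ellipsoid, so the result should land within meters of the figure above. The decimal coordinates are converted from the airport tables below.

```python
from pyproj import Geod

# Airport coordinates from the tables below, converted to decimal degrees.
JLN = (37.151667, -94.498056)   # 37°9′6″N, 94°29′53″W
LPS = (48.483889, -122.937778)  # 48°29′2″N, 122°56′16″W

geod = Geod(ellps="WGS84")
# inv() takes longitude, latitude order and returns azimuths plus meters.
_, _, meters = geod.inv(JLN[1], JLN[0], LPS[1], LPS[0])

print(f"{meters / 1000:.3f} km")         # ~2622.8 km
print(f"{meters / 1609.344:.3f} miles")  # ~1629.8 miles
print(f"{meters / 1852:.3f} nm")         # ~1416.2 nautical miles
```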
Haversine formula
- 1626.784 miles
- 2618.055 kilometers
- 1413.637 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
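A minimal sketch of the haversine formula, assuming a spherical Earth with mean radius 6371 km; the radius chosen shifts the result slightly, so expect small differences from the figure above.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# JLN and LPS coordinates from the airport tables below.
km = haversine_km(37.151667, -94.498056, 48.483889, -122.937778)
print(f"{km:.3f} km / {km / 1.609344:.3f} miles")  # ~2618 km / ~1627 miles
```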
How long does it take to fly from Joplin to Lopez?
The estimated flight time from Joplin Regional Airport to Lopez Island Airport is 3 hours and 35 minutes.
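The site's exact method is not stated; a common rule of thumb, assumed here, adds a fixed allowance for taxi, climb, and descent to the cruise time at a typical speed. The 500 mph speed and 30-minute buffer below are illustrative assumptions and will only approximate the 3 hours 35 minutes quoted above.

```python
# Rough flight-time estimate; the cruise speed and buffer are assumptions.
DISTANCE_MILES = 1630  # great-circle distance from the section above
CRUISE_MPH = 500       # assumed average cruise speed
BUFFER_MIN = 30        # assumed allowance for taxi, climb, and descent

total_min = DISTANCE_MILES / CRUISE_MPH * 60 + BUFFER_MIN
print(f"~{int(total_min // 60)} h {int(total_min % 60)} m")  # ~3 h 45 m
```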
What is the time difference between Joplin and Lopez?
The time difference between Joplin and Lopez is 2 hours. Lopez is 2 hours behind Joplin.
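A quick check of the offset with Python's zoneinfo module (3.9+), assuming Joplin observes America/Chicago time and Lopez Island observes America/Los_Angeles time.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare the two UTC offsets at the same instant.
now = datetime.now(ZoneInfo("America/Chicago"))
diff = now.utcoffset() - now.astimezone(ZoneInfo("America/Los_Angeles")).utcoffset()
print(diff)  # 2:00:00 -> Lopez is 2 hours behind Joplin
```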
Flight carbon footprint between Joplin Regional Airport (JLN) and Lopez Island Airport (LPS)
On average, flying from Joplin to Lopez generates about 188 kg of CO2 per passenger, which is equivalent to 414 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
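A back-of-the-envelope check of the arithmetic above; the implied per-mile rate is derived from the section's own figures and is not the site's actual methodology.

```python
CO2_KG = 188             # per-passenger estimate from the section above
DISTANCE_MILES = 1630    # great-circle distance from the section above
KG_PER_LB = 0.45359237   # exact kilogram-to-pound conversion factor

print(f"{CO2_KG / KG_PER_LB:.0f} lbs")                   # ~414 lbs
print(f"{CO2_KG / DISTANCE_MILES:.3f} kg CO2 per mile")  # ~0.115 kg/mile
```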
Map of flight path and driving directions from Joplin to Lopez
See the map of the shortest flight path between Joplin Regional Airport (JLN) and Lopez Island Airport (LPS).
Airport information
| Origin | Joplin Regional Airport |
| --- | --- |
| City: | Joplin, MO |
| Country: | United States |
| IATA Code: | JLN |
| ICAO Code: | KJLN |
| Coordinates: | 37°9′6″N, 94°29′53″W |
| Destination | Lopez Island Airport |
| --- | --- |
| City: | Lopez, WA |
| Country: | United States |
| IATA Code: | LPS |
| ICAO Code: | S31 |
| Coordinates: | 48°29′2″N, 122°56′16″W |