How far is London from Fort Myers, FL?
The air (straight-line) distance between Fort Myers (Southwest Florida International Airport) and London (London International Airport) is 1138 miles / 1831 kilometers / 989 nautical miles.
The driving distance from Fort Myers (RSW) to London (YXU) is 1424 miles / 2292 kilometers, and travel time by car is about 26 hours 46 minutes.
Southwest Florida International Airport – London International Airport
Distance from Fort Myers to London
There are several ways to calculate the distance from Fort Myers to London. Here are two standard methods:
Vincenty's formula (applied above)
- 1137.884 miles
- 1831.246 kilometers
- 988.794 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1140.502 miles
- 1835.460 kilometers
- 991.069 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
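As an illustration, here is a minimal Python sketch of the haversine calculation using the airport coordinates listed in the tables below. The mean Earth radius of 6,371 km is an assumption; the exact constant used for the figures above is not stated.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# RSW: 26°32′10″N, 81°45′18″W and YXU: 43°2′8″N, 81°9′14″W, in decimal degrees
km = haversine_km(26.5361, -81.7550, 43.0356, -81.1539)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km * 0.539957:.1f} nm")
# -> about 1835 km / 1140 mi / 991 nm, closely matching the haversine figures above
```

For the ellipsoidal (Vincenty-style) figure, a library such as geopy offers a ready-made `geodesic` distance in `geopy.distance`, which avoids implementing the iterative formula by hand.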
How long does it take to fly from Fort Myers to London?
The estimated flight time from Southwest Florida International Airport to London International Airport is 2 hours and 39 minutes.
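The exact flight-time model is not stated. A common rule of thumb is distance divided by an average cruise speed, plus a fixed allowance for taxi, takeoff, and landing; the 500 mph speed and 30-minute allowance below are assumptions for illustration, and the 2 hours 39 minutes quoted above evidently uses slightly different constants.

```python
def flight_time_minutes(miles, avg_speed_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb/descent allowance."""
    return overhead_min + miles / avg_speed_mph * 60

total = flight_time_minutes(1138)
print(f"{int(total // 60)} h {int(total % 60)} min")  # -> 2 h 46 min
```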
What is the time difference between Fort Myers and London?
Fort Myers and London are both in the Eastern Time Zone, so there is no time difference between the two cities.
Flight carbon footprint between Southwest Florida International Airport (RSW) and London International Airport (YXU)
On average, flying from Fort Myers to London generates about 159 kg of CO2 per passenger; 159 kilograms is equal to about 350 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
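For reference, a short sketch of the unit conversion and the per-mile intensity implied by the figures above; the numbers are taken directly from the estimate quoted here, not from an independent emissions model.

```python
CO2_KG = 159   # per-passenger CO2 estimate quoted above
MILES = 1138   # great-circle distance quoted above

pounds = CO2_KG * 2.20462   # kilograms -> pounds
per_mile = CO2_KG / MILES   # implied kg of CO2 per passenger-mile
print(f"{pounds:.1f} lbs, {per_mile:.2f} kg CO2 per passenger-mile")
# -> 350.5 lbs, 0.14 kg CO2 per passenger-mile
```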
Map of flight path and driving directions from Fort Myers to London
See the map of the shortest flight path between Southwest Florida International Airport (RSW) and London International Airport (YXU).
Airport information
| Origin | Southwest Florida International Airport |
| --- | --- |
| City: | Fort Myers, FL |
| Country: | United States |
| IATA Code: | RSW |
| ICAO Code: | KRSW |
| Coordinates: | 26°32′10″N, 81°45′18″W |
| Destination | London International Airport |
| --- | --- |
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |