
How far is London from Meridian, MS?

The distance between Meridian (Meridian Regional Airport) and London (London International Airport) is 847 miles / 1363 kilometers / 736 nautical miles.

The driving distance from Meridian (MEI) to London (YXU) is 996 miles / 1603 kilometers, and travel time by car is about 18 hours 51 minutes.

Meridian Regional Airport – London International Airport

  • 847 miles
  • 1363 kilometers
  • 736 nautical miles


Distance from Meridian to London

There are several ways to calculate the distance from Meridian to London. Here are two standard methods:

Vincenty's formula (applied above)
  • 846.633 miles
  • 1362.524 kilometers
  • 735.704 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
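For readers who want to reproduce the figure above, here is a minimal Python sketch of Vincenty's inverse formula, assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses). The function name and iteration tolerance are illustrative, and the decimal coordinates are converted from the DMS values listed under "Airport information" below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2,
                      a=6378137.0, f=1 / 298.257223563,
                      tol=1e-12, max_iter=200):
    """Inverse Vincenty: geodesic distance in meters on the WGS-84 ellipsoid."""
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sig_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                      if cos2_alpha else 0.0)  # equatorial line
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sig_m + C * cos_sigma * (-1 + 2 * cos_2sig_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sig_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sig_m ** 2)
        - B / 6 * cos_2sig_m * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sig_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MEI 32°19′57″N 88°45′6″W and YXU 43°2′8″N 81°9′14″W in decimal degrees
meters = vincenty_distance(32.3325, -88.7517, 43.0356, -81.1539)
print(f"{meters / 1609.344:.3f} miles")  # should land near the 846.633 miles above
```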

Haversine formula
  • 847.336 miles
  • 1363.655 kilometers
  • 736.315 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
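The haversine version is much shorter. A sketch, assuming the commonly used mean Earth radius of 6371 km; the exact mileage depends on which radius is chosen:

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in kilometers on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

km = haversine_distance(32.3325, -88.7517, 43.0356, -81.1539)
print(f"{km:.1f} km ≈ {km / 1.609344:.1f} miles")  # ≈ 1363.7 km ≈ 847.3 miles
```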

How long does it take to fly from Meridian to London?

The estimated flight time from Meridian Regional Airport to London International Airport is 2 hours and 6 minutes.
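Estimates like this are typically computed as a fixed taxi/climb/descent overhead plus cruise time over the great-circle distance. A sketch of that rule of thumb with assumed values of 500 mph cruise speed and 30 minutes overhead; the 2 hours 6 minutes figure above presumably uses slightly different parameters:

```python
def estimate_flight_time(miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb flight time: fixed overhead plus time at cruise speed."""
    total_min = round(overhead_min + miles / cruise_mph * 60)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} hours {minutes} minutes"

print(estimate_flight_time(847))  # "2 hours 12 minutes" under these assumptions
```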

Flight carbon footprint between Meridian Regional Airport (MEI) and London International Airport (YXU)

On average, flying from Meridian to London generates about 139 kg of CO2 per passenger, which is roughly 306 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
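The unit conversion is a single multiplication (1 kg ≈ 2.20462 lb):

```python
LB_PER_KG = 2.20462
print(f"{139 * LB_PER_KG:.0f} lb")  # ≈ 306 lb
```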

Map of flight path and driving directions from Meridian to London

See the map of the shortest flight path between Meridian Regional Airport (MEI) and London International Airport (YXU).

Airport information

Origin: Meridian Regional Airport
City: Meridian, MS
Country: United States
IATA Code: MEI
ICAO Code: KMEI
Coordinates: 32°19′57″N, 88°45′6″W
Destination: London International Airport
City: London
Country: Canada
IATA Code: YXU
ICAO Code: CYXU
Coordinates: 43°2′8″N, 81°9′14″W
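To use these coordinates in the formulas above, convert degrees/minutes/seconds to signed decimal degrees (southern and western hemispheres are negative). A minimal sketch; the function name is illustrative:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(32, 19, 57, "N"))  # 32.3325    (MEI latitude)
print(dms_to_decimal(88, 45, 6, "W"))   # ≈ -88.7517 (MEI longitude)
print(dms_to_decimal(43, 2, 8, "N"))    # ≈ 43.0356  (YXU latitude)
print(dms_to_decimal(81, 9, 14, "W"))   # ≈ -81.1539 (YXU longitude)
```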