How far is London from Saginaw, MI?
The distance between Saginaw (Saginaw MBS International Airport) and London (London International Airport) is 151 miles / 244 kilometers / 132 nautical miles.
The driving distance from Saginaw (MBS) to London (YXU) is 203 miles / 327 kilometers, and travel time by car is about 3 hours 55 minutes.
Saginaw MBS International Airport – London International Airport
Distance from Saginaw to London
There are several ways to calculate the distance from Saginaw to London. Here are two standard methods:
Vincenty's formula (applied above)
- 151.484 miles
- 243.790 kilometers
- 131.636 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
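To illustrate the ellipsoidal calculation, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, applied to the airport coordinates listed in the tables below (converted by hand to decimal degrees). It is a generic implementation of the published formula, not this calculator's own code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0       # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# MBS (43°31′58″N, 84°4′46″W) to YXU (43°2′8″N, 81°9′14″W) in decimal degrees
metres = vincenty_distance(43.5328, -84.0794, 43.0356, -81.1539)
print(metres / 1609.344, metres / 1000)  # ≈ 151.5 miles, ≈ 243.8 km
```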
Haversine formula
- 151.104 miles
- 243.178 kilometers
- 131.306 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
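A matching haversine sketch in Python (assuming a mean Earth radius of 6,371 km) applied to the same coordinates reproduces the spherical figure above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a spherical Earth, returned in statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    km = 2 * radius_km * math.asin(math.sqrt(a))
    return km / 1.609344

# MBS to YXU, same decimal-degree coordinates as above
print(haversine_miles(43.5328, -84.0794, 43.0356, -81.1539))  # ≈ 151.1 miles
```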
How long does it take to fly from Saginaw to London?
The estimated flight time from Saginaw MBS International Airport to London International Airport is 47 minutes.
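The page does not say how this estimate is derived. A common rule of thumb, used here purely as an assumption, adds a fixed taxi, climb and descent allowance to the cruise time implied by the distance and an average speed:

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    # cruise_mph and overhead_min are illustrative assumptions,
    # not this calculator's published parameters.
    return overhead_min + distance_miles / cruise_mph * 60

print(round(estimated_flight_minutes(151.484)))  # ≈ 48 minutes, close to the quoted 47
```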
What is the time difference between Saginaw and London?
There is no time difference between Saginaw and London: both cities observe Eastern Time.
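This can be checked programmatically with Python's standard zoneinfo module, assuming the usual IANA zones America/Detroit for Saginaw and America/Toronto for London, ON:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

now = datetime.now(timezone.utc)
offset_mbs = now.astimezone(ZoneInfo("America/Detroit")).utcoffset()   # Saginaw, MI
offset_yxu = now.astimezone(ZoneInfo("America/Toronto")).utcoffset()   # London, ON
print(offset_mbs - offset_yxu)  # 0:00:00, i.e. no time difference
```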
Flight carbon footprint between Saginaw MBS International Airport (MBS) and London International Airport (YXU)
On average, flying from Saginaw to London generates about 47 kg of CO2 per passenger, which is roughly 104 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
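The unit conversion above can be reproduced directly; the per-mile figure in the comment is simply derived from the numbers on this page and is not an official emission factor.

```python
KG_PER_POUND = 0.45359237

co2_kg = 47                     # per-passenger estimate quoted above
co2_lb = co2_kg / KG_PER_POUND  # ≈ 103.6, shown as 104 lbs on the page
per_mile = co2_kg / 151.484     # ≈ 0.31 kg CO2 per passenger-mile (derived, illustrative only)
print(round(co2_lb), round(per_mile, 2))
```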
Map of flight path and driving directions from Saginaw to London
See the map of the shortest flight path between Saginaw MBS International Airport (MBS) and London International Airport (YXU).
Airport information
| Origin | Saginaw MBS International Airport |
|---|---|
| City: | Saginaw, MI |
| Country: | United States |
| IATA Code: | MBS |
| ICAO Code: | KMBS |
| Coordinates: | 43°31′58″N, 84°4′46″W |
| Destination | London International Airport |
|---|---|
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |