How far is London from Windsor Locks, CT?
The distance between Windsor Locks (Bradley International Airport) and London, Ontario (London International Airport) is 439 miles / 707 kilometers / 382 nautical miles.
The driving distance from Windsor Locks (BDL) to London (YXU) is 519 miles / 835 kilometers, and travel time by car is about 10 hours 34 minutes.
Bradley International Airport – London International Airport
Distance from Windsor Locks to London
There are several ways to calculate the distance from Windsor Locks to London. Here are two standard methods:
Vincenty's formula (applied above)
- 439.109 miles
- 706.678 kilometers
- 381.575 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
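For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates in the usage lines come from the airport table at the bottom of this page; the convergence tolerance and iteration cap are assumptions of this example, and the formula can fail to converge for nearly antipodal points.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid.
    Returns the geodesic distance in metres. West longitudes
    and south latitudes are negative."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                     # iteration cap is an assumption
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                       # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        # cos(2 * sigma_m); zero for points on the equator
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:       # tolerance is an assumption
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# Coordinates from the airport table below (west longitude negative)
BDL = (41.9389, -72.6831)   # 41°56′20″N, 72°40′59″W
YXU = (43.0356, -81.1539)   # 43°2′8″N, 81°9′14″W
metres = vincenty_distance(BDL[0], BDL[1], YXU[0], YXU[1])
print(f"{metres / 1609.344:.3f} mi")   # ≈ 439.1 mi
```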
Haversine formula
- 437.996 miles
- 704.886 kilometers
- 380.608 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
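The haversine calculation is short enough to show in full. Below is a minimal sketch assuming the commonly used mean Earth radius of 6,371 km; modeling the Earth as a sphere with a single radius is why these numbers differ slightly from the ellipsoidal ones above.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres on a spherical Earth.
    R = 6371 km is a common mean-radius choice (an assumption here)."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

km = haversine_distance(41.9389, -72.6831, 43.0356, -81.1539)  # BDL -> YXU
print(f"{km:.3f} km / {km / 1.609344:.3f} mi")  # ≈ 704.9 km / 438.0 mi
```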
How long does it take to fly from Windsor Locks to London?
The estimated flight time from Bradley International Airport to London International Airport is 1 hour and 19 minutes.
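The page does not state how this estimate is derived. As an illustration only, the sketch below uses one common rule of thumb: a fixed allowance of about 30 minutes for taxi, takeoff, and landing, plus cruise at roughly 500 mph. Both parameters are assumptions of this example, which is why it lands a few minutes off the quoted figure rather than reproducing it exactly.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed ground/climb overhead plus
    cruise time. Both parameters are assumptions for illustration."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hour(s) {minutes} minute(s)"

print(estimate_flight_time(439.109))  # ≈ 1 hour(s) 23 minute(s) with these assumptions
```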
What is the time difference between Windsor Locks and London?
There is no time difference between Windsor Locks and London.
Flight carbon footprint between Bradley International Airport (BDL) and London International Airport (YXU)
On average, flying from Windsor Locks to London generates about 90 kg of CO2 per passenger, which is about 198 pounds (at roughly 2.205 lb per kg). These figures are estimates and include only the CO2 generated by burning jet fuel.
Map of flight path and driving directions from Windsor Locks to London
See the map of the shortest flight path between Bradley International Airport (BDL) and London International Airport (YXU).
Airport information
| Origin | Bradley International Airport |
| --- | --- |
| City: | Windsor Locks, CT |
| Country: | United States |
| IATA Code: | BDL |
| ICAO Code: | KBDL |
| Coordinates: | 41°56′20″N, 72°40′59″W |
| Destination | London International Airport |
| --- | --- |
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |