How far is Windsor from St John's?
The distance between St John's (V. C. Bird International Airport) and Windsor (Windsor International Airport) is 2136 miles / 3437 kilometers / 1856 nautical miles.
V. C. Bird International Airport – Windsor International Airport
Distance from St John's to Windsor
There are several ways to calculate the distance from St John's to Windsor. Here are two standard methods:
Vincenty's formula (applied above)
- 2135.738 miles
- 3437.137 kilometers
- 1855.905 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
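The iteration behind Vincenty's inverse method can be sketched as follows. This is a minimal implementation of the standard published algorithm on the WGS-84 ellipsoid; the decimal coordinates used in the usage note are converted from the DMS values in the airport table below.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in km via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                # equatorial radius, meters
    f = 1 / 298.257223563        # flattening
    b = a * (1 - f)              # polar radius

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere.
    U1 = math.atan((1 - f) * math.tan(phi1))
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0           # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
            C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # meters -> km
```

Running `vincenty_km(17.1367, -61.7925, 42.2756, -82.9556)` reproduces the roughly 3437 km figure quoted above.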
Haversine formula
- 2138.612 miles
- 3441.762 kilometers
- 1858.403 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
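The haversine calculation is compact enough to show in full. This is a minimal sketch using a mean Earth radius of 6371 km; the decimal coordinates in the usage note are converted from the DMS values in the airport table below.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))
```

`haversine_km(17.1367, -61.7925, 42.2756, -82.9556)` comes out near the 3441.762 km figure above; the small gap from the Vincenty result reflects the spherical versus ellipsoidal Earth models.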
How long does it take to fly from St John's to Windsor?
The estimated flight time from V. C. Bird International Airport to Windsor International Airport is 4 hours and 32 minutes.
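A common way to estimate flight time is cruise time plus a fixed allowance for taxi, takeoff, climb, and descent. The sketch below assumes a 500 mph average cruise speed and a 30-minute overhead; these parameters are our assumptions, not the site's, so the result differs slightly from the 4 h 32 min quoted above.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight time: cruise leg plus a fixed takeoff/landing allowance.

    cruise_mph and overhead_min are illustrative assumptions.
    """
    hours = distance_miles / cruise_mph + overhead_min / 60
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"
```

For the 2136-mile ANU-YQG distance this yields `4 h 46 min`, in the same ballpark as the estimate above.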
What is the time difference between St John's and Windsor?
The time difference between St John's and Windsor is 1 hour. Windsor is 1 hour behind St John's.
Flight carbon footprint between V. C. Bird International Airport (ANU) and Windsor International Airport (YQG)
On average, flying from St John's to Windsor generates about 233 kg (514 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
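The kilogram-to-pound conversion behind the figure above is straightforward; the sketch below uses the exact definition of the pound (0.45359237 kg).

```python
KG_PER_LB = 0.45359237  # exact, by international definition

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB
```

`round(kg_to_lb(233))` gives 514, matching the quoted per-passenger figure.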
Map of flight path from St John's to Windsor
See the map of the shortest flight path between V. C. Bird International Airport (ANU) and Windsor International Airport (YQG).
Airport information
| Origin | V. C. Bird International Airport |
|---|---|
| City: | St John's |
| Country: | Antigua and Barbuda |
| IATA Code: | ANU |
| ICAO Code: | TAPA |
| Coordinates: | 17°8′12″N, 61°47′33″W |
| Destination | Windsor International Airport |
|---|---|
| City: | Windsor |
| Country: | Canada |
| IATA Code: | YQG |
| ICAO Code: | CYQG |
| Coordinates: | 42°16′32″N, 82°57′20″W |
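The coordinates in the tables above are given in degrees, minutes, and seconds; distance formulas need them in decimal degrees. A minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W)
    to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)
```

For example, ANU's latitude 17°8′12″N converts to about 17.1367, and its longitude 61°47′33″W to about -61.7925.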