How far is Fort St. John from Ushuaia?
The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Fort St. John (Fort St. John Airport) is 8202 miles / 13200 kilometers / 7127 nautical miles.
Ushuaia – Malvinas Argentinas International Airport – Fort St. John Airport
Distance from Ushuaia to Fort St. John
There are several ways to calculate the distance from Ushuaia to Fort St. John. Here are two standard methods:
Vincenty's formula (applied above)
- 8201.975 miles
- 13199.800 kilometers
- 7127.322 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 8221.372 miles
- 13231.015 kilometers
- 7144.177 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points along the earth's surface).
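For reference, the haversine figure above can be reproduced directly from the airport coordinates listed further down this page. The sketch below is a minimal Python implementation assuming a mean Earth radius of 3,958.8 miles; a slightly different radius shifts the result by a few miles.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points on a spherical Earth, in miles."""
    earth_radius_miles = 3958.8  # mean Earth radius (assumption)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * earth_radius_miles * math.asin(math.sqrt(a))

# USH 54°50′35″S, 68°17′44″W and YXJ 56°14′17″N, 120°44′23″W as decimal degrees
ush = (-54.8431, -68.2956)
yxj = (56.2381, -120.7397)
print(round(haversine_miles(*ush, *yxj), 1))  # roughly 8221 miles, close to the value above
```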
How long does it take to fly from Ushuaia to Fort St. John?
The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Fort St. John Airport is 16 hours and 1 minute.
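The page does not state how this duration is derived. A minimal sketch, assuming it is simply the great-circle distance divided by an assumed average block speed of about 510 mph (both the method and the speed are assumptions, not the site's published formula):

```python
def flight_time_estimate(distance_miles, avg_speed_mph=510):
    """Very rough duration estimate: distance divided by an assumed average block speed."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    hours, minutes = divmod(total_minutes, 60)
    return f"{hours} hours and {minutes} minutes"

print(flight_time_estimate(8201.975))  # about 16 hours with the assumed speed
```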
What is the time difference between Ushuaia and Fort St. John?
Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Fort St. John Airport (YXJ)
On average, flying from Ushuaia to Fort St. John generates about 1,028 kg of CO2 per passenger, which equals 2,266 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
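As a rough cross-check of the figure above, the arithmetic can be sketched as a flat per-kilometre emission factor of about 78 g of CO2 per passenger-km applied to the great-circle distance. Both the factor and the method are assumptions; the page's own methodology is not given.

```python
KG_PER_POUND = 0.45359237

def co2_per_passenger_kg(distance_km, kg_per_km=0.078):
    """Per-passenger CO2 estimate; the 78 g/km emission factor is an assumption."""
    return distance_km * kg_per_km

kg = co2_per_passenger_kg(13199.8)   # great-circle distance in km from above
lbs = kg / KG_PER_POUND              # convert kilograms to pounds
print(round(kg), round(lbs))         # roughly 1030 kg / 2270 lbs with the assumed factor
```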
Map of flight path from Ushuaia to Fort St. John
See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Fort St. John Airport (YXJ).
Airport information
| Origin | Ushuaia – Malvinas Argentinas International Airport |
|---|---|
| City: | Ushuaia |
| Country: | Argentina |
| IATA Code: | USH |
| ICAO Code: | SAWH |
| Coordinates: | 54°50′35″S, 68°17′44″W |
| Destination | Fort St. John Airport |
|---|---|
| City: | Fort St. John |
| Country: | Canada |
| IATA Code: | YXJ |
| ICAO Code: | CYXJ |
| Coordinates: | 56°14′17″N, 120°44′23″W |
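The coordinates in both tables are given in degrees, minutes, and seconds. To feed them into the haversine sketch earlier on this page they must first be converted to signed decimal degrees; a minimal conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# USH: 54°50′35″S, 68°17′44″W
print(dms_to_decimal(54, 50, 35, "S"), dms_to_decimal(68, 17, 44, "W"))
# YXJ: 56°14′17″N, 120°44′23″W
print(dms_to_decimal(56, 14, 17, "N"), dms_to_decimal(120, 44, 23, "W"))
```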