How far is Brochet from Ushuaia?
The distance between Ushuaia (Ushuaia – Malvinas Argentinas International Airport) and Brochet (Brochet Airport) is 7987 miles / 12854 kilometers / 6941 nautical miles.
Distance from Ushuaia to Brochet
There are several ways to calculate the distance from Ushuaia to Brochet. Here are two standard methods:
Vincenty's formula (applied above)
- 7987.217 miles
- 12854.180 kilometers
- 6940.702 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
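For illustration, here is a minimal Python sketch of the inverse Vincenty iteration on the WGS-84 ellipsoid. This is a simplified implementation, not the site's own code, and it omits the special handling needed for nearly antipodal points:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # usually converges in a handful of iterations
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma)   # metres

# USH and YBT coordinates from the airport tables below, in decimal degrees
print(round(vincenty_distance(-54.8431, -68.2956, 57.8892, -101.6789) / 1000, 1))
# ≈ 12854 km, in line with the figure above
```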
Haversine formula
- 8008.502 miles
- 12888.435 kilometers
- 6959.198 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
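The haversine calculation fits in a few lines. A minimal sketch, assuming a mean Earth radius of 6371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.atan2(math.sqrt(h), math.sqrt(1 - h))

# USH and YBT coordinates from the airport tables below, in decimal degrees
print(round(haversine_km(-54.8431, -68.2956, 57.8892, -101.6789), 1))
# ≈ 12888 km; the exact value depends on the Earth radius and coordinate rounding
```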
How long does it take to fly from Ushuaia to Brochet?
The estimated flight time from Ushuaia – Malvinas Argentinas International Airport to Brochet Airport is 15 hours and 37 minutes.
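The assumptions behind that figure aren't stated here. Estimates of this kind usually divide the distance by an assumed average speed; the 500 mph below is a hypothetical value, which is why the result differs slightly from the 15 hours 37 minutes above:

```python
# Hypothetical back-of-the-envelope estimate; the average speed is an assumption.
distance_miles = 7987.217
avg_speed_mph = 500                     # assumed gate-to-gate average speed

hours = distance_miles / avg_speed_mph
h, m = int(hours), round(hours % 1 * 60)
print(f"{h} h {m} min")                 # 15 h 58 min with these assumptions
```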
What is the time difference between Ushuaia and Brochet?
The time difference between Ushuaia and Brochet is 3 hours: Brochet is 3 hours behind Ushuaia.
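This can be checked against the IANA time zone database, assuming Ushuaia maps to America/Argentina/Ushuaia (UTC−3, no DST) and Brochet, Manitoba falls under America/Winnipeg (UTC−6 CST / UTC−5 CDT):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

when = datetime(2024, 1, 15, 12, 0)     # arbitrary date outside Manitoba's DST
ush = when.replace(tzinfo=ZoneInfo("America/Argentina/Ushuaia"))
ybt = when.replace(tzinfo=ZoneInfo("America/Winnipeg"))
diff = (ush.utcoffset() - ybt.utcoffset()).total_seconds() / 3600
print(diff)   # 3.0 -> Brochet is 3 hours behind Ushuaia (2 while CDT is in effect)
```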
Flight carbon footprint between Ushuaia – Malvinas Argentinas International Airport (USH) and Brochet Airport (YBT)
On average, flying from Ushuaia to Brochet generates about 997 kg of CO2 per passenger; 997 kilograms equals 2,197 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
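As a sanity check on the units, the kilogram-to-pound conversion and the implied per-kilometre intensity are simple arithmetic; the pounds figure lands within a pound of the one above, which was presumably converted from the unrounded kg value:

```python
co2_kg = 997
distance_km = 12854.180

print(round(co2_kg * 2.20462))          # 2198 lbs from the rounded kg figure
print(round(co2_kg / distance_km, 3))   # ≈ 0.078 kg CO2 per passenger-km
```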
Map of flight path from Ushuaia to Brochet
See the map of the shortest flight path between Ushuaia – Malvinas Argentinas International Airport (USH) and Brochet Airport (YBT).
Airport information
| Origin | Ushuaia – Malvinas Argentinas International Airport |
| --- | --- |
| City: | Ushuaia |
| Country: | Argentina |
| IATA Code: | USH |
| ICAO Code: | SAWH |
| Coordinates: | 54°50′35″S, 68°17′44″W |

| Destination | Brochet Airport |
| --- | --- |
| City: | Brochet |
| Country: | Canada |
| IATA Code: | YBT |
| ICAO Code: | CYBT |
| Coordinates: | 57°53′21″N, 101°40′44″W |
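The coordinates above are listed in degrees-minutes-seconds, while the snippets earlier use decimal degrees. A small hypothetical helper for the conversion:

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert '54°50′35″S'-style coordinates to signed decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(round(dms_to_decimal("54°50′35″S"), 4))    # -54.8431
print(round(dms_to_decimal("101°40′44″W"), 4))   # -101.6789
```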