How far is Tyler, TX, from Arctic Bay?
The distance between Arctic Bay (Arctic Bay Airport) and Tyler (Tyler Pounds Regional Airport) is 2835 miles / 4563 kilometers / 2464 nautical miles.
Arctic Bay Airport – Tyler Pounds Regional Airport
Distance from Arctic Bay to Tyler
There are several ways to calculate the distance from Arctic Bay to Tyler. Here are two standard methods:
Vincenty's formula (applied above)
- 2835.118 miles
- 4562.681 kilometers
- 2463.650 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
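For readers who want to reproduce the figure above, here is a minimal sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, following the standard published formulation; the coordinates are the airport positions from the tables below, converted to decimal degrees.

```python
import math

def vincenty_distance_km(lat1, lon1, lat2, lon2):
    """Distance on the WGS-84 ellipsoid via Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = a * (1 - f)            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# YAB: 73°0′20″N 85°2′33″W    TYR: 32°21′14″N 95°24′8″W
yab = (73 + 20 / 3600, -(85 + 2 / 60 + 33 / 3600))
tyr = (32 + 21 / 60 + 14 / 3600, -(95 + 24 / 60 + 8 / 3600))
km = vincenty_distance_km(*yab, *tyr)
print(round(km, 3))  # ≈ 4562.68 km, i.e. about 2835 miles
```

Note that the iteration can converge slowly (or fail) for nearly antipodal points; in production code a library such as geographiclib is the safer choice.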
Haversine formula
- 2833.135 miles
- 4559.489 kilometers
- 2461.927 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points on the surface of a sphere).
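The haversine computation is compact enough to sketch directly. Assuming a mean Earth radius of 6371 km and using the airport coordinates from the tables below in decimal degrees:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km."""
    R = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# YAB (73°0′20″N, 85°2′33″W) and TYR (32°21′14″N, 95°24′8″W)
km = haversine_km(73.00556, -85.04250, 32.35389, -95.40222)
mi = km / 1.609344
print(f"{km:.1f} km / {mi:.1f} mi")  # ≈ 4559.5 km / 2833.1 mi
```

The small gap between this result and Vincenty's (about 3 km over 4560 km) is the cost of treating the Earth as a perfect sphere.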
How long does it take to fly from Arctic Bay to Tyler?
The estimated flight time from Arctic Bay Airport to Tyler Pounds Regional Airport is 5 hours and 52 minutes.
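The page does not state how this estimate is derived. A common rule of thumb divides the great-circle distance by an assumed average cruise speed and adds a fixed allowance for taxi, climb, and descent; both parameters below are hypothetical, not values from this page.

```python
# Hypothetical parameters: ~500 mph average speed plus a 30-minute
# taxi/climb/descent allowance (not stated by the page).
def estimated_flight_time(distance_mi, cruise_mph=500, overhead_min=30):
    total_min = distance_mi / cruise_mph * 60 + overhead_min
    return int(total_min // 60), int(round(total_min % 60))

h, m = estimated_flight_time(2835.118)
print(f"{h} h {m} min")  # → 6 h 10 min
```

This rule of thumb lands slightly above the quoted 5 hours 52 minutes, so the site presumably assumes a somewhat higher average speed or smaller overhead.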
What is the time difference between Arctic Bay and Tyler?
Flight carbon footprint between Arctic Bay Airport (YAB) and Tyler Pounds Regional Airport (TYR)
On average, flying from Arctic Bay to Tyler generates about 315 kg of CO2 per passenger, equivalent to 694 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
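The kilogram-to-pound conversion above is easy to verify using the exact definition of the pound:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 315
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # → 694
```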
Map of flight path from Arctic Bay to Tyler
See the map of the shortest flight path between Arctic Bay Airport (YAB) and Tyler Pounds Regional Airport (TYR).
Airport information
| Origin | Arctic Bay Airport |
| --- | --- |
| City: | Arctic Bay |
| Country: | Canada |
| IATA Code: | YAB |
| ICAO Code: | CYAB |
| Coordinates: | 73°0′20″N, 85°2′33″W |
| Destination | Tyler Pounds Regional Airport |
| --- | --- |
| City: | Tyler, TX |
| Country: | United States |
| IATA Code: | TYR |
| ICAO Code: | KTYR |
| Coordinates: | 32°21′14″N, 95°24′8″W |