How far is Uranium City from Tulsa, OK?
The distance between Tulsa (Tulsa International Airport) and Uranium City (Uranium City Airport) is 1710 miles / 2752 kilometers / 1486 nautical miles.
The driving distance from Tulsa (TUL) to Uranium City (YBE) is 2215 miles / 3565 kilometers, and travel time by car is about 53 hours 32 minutes.
Tulsa International Airport – Uranium City Airport
Distance from Tulsa to Uranium City
There are several ways to calculate the distance from Tulsa to Uranium City. Here are two standard methods:
Vincenty's formula (applied above)
- 1710.176 miles
- 2752.261 kilometers
- 1486.102 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
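The iterative inverse Vincenty method on the WGS-84 ellipsoid can be sketched in Python as follows. This is an illustrative implementation, not necessarily the exact code used for the figures above; the coordinates are taken from the airport information table at the bottom of the page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty: geodesic distance in meters on the WGS-84 ellipsoid."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# TUL -> YBE, coordinates from the airport information table
meters = vincenty_distance(36.19833, -95.88806, 59.56139, -108.48083)
print(round(meters / 1609.344, 3), "miles")
```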
Haversine formula
- 1709.751 miles
- 2751.578 kilometers
- 1485.733 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
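The great-circle figures can be reproduced with a short haversine sketch, assuming the conventional mean Earth radius of 6371 km (a different assumed radius shifts the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# TUL -> YBE, coordinates from the airport information table
km = haversine_km(36.19833, -95.88806, 59.56139, -108.48083)
print(round(km, 3), "km")   # close to the 2751.578 km quoted above
```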
How long does it take to fly from Tulsa to Uranium City?
The estimated flight time from Tulsa International Airport to Uranium City Airport is 3 hours and 44 minutes.
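Estimates like this are typically derived from the great-circle distance and an assumed effective block speed covering taxi, climb, cruise, and descent. A hypothetical sketch follows; the 458 mph effective speed is back-solved from the quoted figures purely for illustration and is not the site's actual model:

```python
def estimate_flight_minutes(distance_miles, block_speed_mph=458.0):
    """Rough block-time estimate: distance divided by an assumed
    effective average speed (illustrative figure, not real TUL-YBE data)."""
    return distance_miles / block_speed_mph * 60

minutes = estimate_flight_minutes(1710)
print(int(minutes // 60), "h", round(minutes % 60), "min")  # → 3 h 44 min
```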
What is the time difference between Tulsa and Uranium City?
Tulsa is in the Central Time Zone and observes daylight saving time, while Uranium City, in Saskatchewan, stays on Central Standard Time year-round. The two cities therefore share the same local time in winter, and Tulsa is one hour ahead while daylight saving time is in effect.
Flight carbon footprint between Tulsa International Airport (TUL) and Uranium City Airport (YBE)
On average, flying from Tulsa to Uranium City generates about 193 kg of CO2 per passenger, equivalent to about 425 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
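The unit conversion above can be checked directly. The per-mile emission factor shown is simply back-solved from the quoted totals for illustration; real carbon models vary with aircraft type and load factor:

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

co2_kg = 193
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb), "lb")        # 193 kg is about 425.5 lb before rounding

# implied per-passenger emission factor for this route (illustrative)
factor = co2_kg / 1710            # kg CO2 per great-circle mile
print(round(factor, 3), "kg CO2 / mile")
```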
Airport information
| Origin | Tulsa International Airport |
|---|---|
| City | Tulsa, OK |
| Country | United States |
| IATA Code | TUL |
| ICAO Code | KTUL |
| Coordinates | 36°11′54″N, 95°53′17″W |
| Destination | Uranium City Airport |
|---|---|
| City | Uranium City |
| Country | Canada |
| IATA Code | YBE |
| ICAO Code | CYBE |
| Coordinates | 59°33′41″N, 108°28′51″W |