How far is Uranium City from Fargo, ND?
The distance between Fargo (Hector International Airport) and Uranium City (Uranium City Airport) is 996 miles / 1603 kilometers / 865 nautical miles.
The driving distance from Fargo (FAR) to Uranium City (YBE) is 1419 miles / 2283 kilometers, and travel time by car is about 38 hours 51 minutes.
Hector International Airport – Uranium City Airport
Distance from Fargo to Uranium City
There are several ways to calculate the distance from Fargo to Uranium City. Here are two standard methods:
Vincenty's formula (applied above)
- 995.835 miles
- 1602.641 kilometers
- 865.357 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
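As an illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid (not necessarily this site's exact implementation). The airport coordinates come from the tables further down, converted to decimal degrees.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    # WGS-84 ellipsoid parameters
    a = 6378137.0               # semi-major axis (metres)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (metres)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    lam = L

    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    # Iterate the auxiliary longitude lambda until it converges
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1.0 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha != 0.0 else 0.0)
        C = f / 16.0 * cos_sq_alpha * (4.0 + f * (4.0 - 3.0 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    # Closed-form correction terms on the converged geodesic
    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384.0 * (4096.0 + u_sq * (-768.0 + u_sq * (320.0 - 175.0 * u_sq)))
    B = u_sq / 1024.0 * (256.0 + u_sq * (-128.0 + u_sq * (74.0 - 47.0 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos2sigma_m ** 2) -
        B / 6.0 * cos2sigma_m * (-3.0 + 4.0 * sin_sigma ** 2) *
        (-3.0 + 4.0 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # geodesic length in metres

# FAR (46°55′14″N, 96°48′56″W) to YBE (59°33′41″N, 108°28′51″W), decimal degrees
metres = vincenty_distance(46.9206, -96.8156, 59.5614, -108.4808)
print(f"{metres / 1609.344:.1f} miles")  # roughly 996 miles
```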
Haversine formula
- 994.443 miles
- 1600.400 kilometers
- 864.147 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
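A haversine implementation is much shorter. The sketch below assumes a mean Earth radius of 6,371 km, which reproduces the figures above to within about a kilometre.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere with the given mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# FAR to YBE, using the airport coordinates from the tables below
km = haversine_distance(46.9206, -96.8156, 59.5614, -108.4808)
print(f"{km:.0f} km ≈ {km / 1.609344:.0f} miles")  # roughly 1600 km / 994 miles
```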
How long does it take to fly from Fargo to Uranium City?
The estimated flight time from Hector International Airport to Uranium City Airport is 2 hours and 23 minutes.
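The site's exact formula is not published; a common rule of thumb is the great-circle distance flown at a typical jet cruise speed of around 500 mph, plus roughly 30 minutes for taxi, climb, and descent. Those parameters are assumptions, so the sketch below lands close to, but not exactly on, the quoted figure.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_min=30.0):
    # Rough block-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    # Both parameters are assumed averages, not this site's published method.
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    return divmod(round(minutes), 60)

hours, mins = estimate_flight_time(996)
print(f"{hours} h {mins} min")  # about 2 h 30 min, close to the quoted 2 h 23 min
```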
What is the time difference between Fargo and Uranium City?
Flight carbon footprint between Hector International Airport (FAR) and Uranium City Airport (YBE)
On average, flying from Fargo to Uranium City generates about 151 kg (332 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
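To see where a number like this comes from: burning 1 kg of jet fuel releases roughly 3.16 kg of CO2, so a per-passenger figure follows from an assumed average fuel burn per passenger-mile. The rate in the sketch below is an assumption chosen only to illustrate the arithmetic, not the site's actual model.

```python
CO2_PER_KG_FUEL = 3.16   # kg of CO2 released per kg of jet fuel burned
LB_PER_KG = 2.20462

def co2_per_passenger(distance_miles, fuel_kg_per_pax_mile=0.048):
    # Very rough per-passenger estimate; the fuel-burn rate per passenger-mile
    # is an assumed average and varies with aircraft type and load factor.
    return distance_miles * fuel_kg_per_pax_mile * CO2_PER_KG_FUEL

kg = co2_per_passenger(996)
print(f"{kg:.0f} kg CO2 ≈ {kg * LB_PER_KG:.0f} lb")  # about 151 kg / 333 lb
```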
Map of flight path and driving directions from Fargo to Uranium City
See the map of the shortest flight path between Hector International Airport (FAR) and Uranium City Airport (YBE).
Airport information
| Origin | Hector International Airport |
| --- | --- |
| City: | Fargo, ND |
| Country: | United States |
| IATA Code: | FAR |
| ICAO Code: | KFAR |
| Coordinates: | 46°55′14″N, 96°48′56″W |
| Destination | Uranium City Airport |
| --- | --- |
| City: | Uranium City |
| Country: | Canada |
| IATA Code: | YBE |
| ICAO Code: | CYBE |
| Coordinates: | 59°33′41″N, 108°28′51″W |
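The coordinates above are given in degrees, minutes, and seconds; a small helper like the hypothetical one below converts them to the signed decimal degrees used in the distance sketches earlier.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    # to signed decimal degrees (south and west are negative).
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Coordinates from the tables above
far = (dms_to_decimal(46, 55, 14, "N"), dms_to_decimal(96, 48, 56, "W"))
ybe = (dms_to_decimal(59, 33, 41, "N"), dms_to_decimal(108, 28, 51, "W"))
print(far, ybe)  # approximately (46.9206, -96.8156) and (59.5614, -108.4808)
```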