How far is Uranium City from Bucharest?

The distance between Bucharest (Bucharest Henri Coandă International Airport) and Uranium City (Uranium City Airport) is 4810 miles / 7741 kilometers / 4180 nautical miles.

Distance from Bucharest to Uranium City

There are several ways to calculate the distance from Bucharest to Uranium City. Here are two standard methods:

Vincenty's formula (applied above)
  • 4809.914 miles
  • 7740.807 kilometers
  • 4179.701 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
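
As a rough illustration, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are the decimal-degree equivalents of the airport coordinates listed below; the iteration limit and convergence tolerance are arbitrary choices, and production code would normally rely on a maintained geodesy library instead.

import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero when both points lie on the equator
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha != 0 else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344   # meters -> statute miles

# OTP (44°34′19″N, 26°6′7″E) and YBE (59°33′41″N, 108°28′51″W)
print(round(vincenty_miles(44.5719, 26.1019, 59.5614, -108.4808), 3))  # ≈ 4809.914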

Haversine formula
  • 4795.468 miles
  • 7717.557 kilometers
  • 4167.148 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
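
A haversine version is much shorter. This sketch assumes the commonly used mean Earth radius of 6371 km; the page does not state which radius it used, so the last decimals may differ slightly.

import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere; returns statute miles."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    central_angle = 2 * math.asin(math.sqrt(h))
    return radius_km * central_angle / 1.609344  # km -> statute miles

print(round(haversine_miles(44.5719, 26.1019, 59.5614, -108.4808), 3))  # ≈ 4795.468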

How long does it take to fly from Bucharest to Uranium City?

The estimated flight time from Bucharest Henri Coandă International Airport to Uranium City Airport is 9 hours and 36 minutes.
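
The page does not say how this estimate is derived, but a common rule of thumb (total distance divided by an assumed average block speed, here a hypothetical 500 mph) lands within about a minute of the quoted time:

def flight_time(distance_miles, avg_speed_mph=500.0):
    """Estimate block time from distance at an assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(4809.914))  # "9 hours and 37 minutes" at an assumed 500 mph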

Flight carbon footprint between Bucharest Henri Coandă International Airport (OTP) and Uranium City Airport (YBE)

On average, flying from Bucharest to Uranium City generates about 559 kg of CO2 per passenger; 559 kilograms is equal to 1,233 pounds (lbs). These figures are estimates that include only the CO2 generated by burning jet fuel.
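
As a back-of-the-envelope check, the figures above imply an emission factor of roughly 72.2 g of CO2 per passenger-kilometer. That factor is back-solved from the page's own numbers, not a published constant, and small rounding differences account for the pound figure:

KG_PER_LB = 0.45359237

def co2_kg(distance_km, grams_per_pax_km=72.2):
    """Per-passenger CO2 from an assumed long-haul emission factor."""
    return distance_km * grams_per_pax_km / 1000.0

kg = co2_kg(7741)                         # ≈ 559 kg with the implied factor
print(round(kg), round(kg / KG_PER_LB))   # ≈ 559 kg, ≈ 1,232 lb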

Map of flight path from Bucharest to Uranium City

See the map of the shortest flight path between Bucharest Henri Coandă International Airport (OTP) and Uranium City Airport (YBE).

Airport information

Origin: Bucharest Henri Coandă International Airport
City: Bucharest
Country: Romania
IATA Code: OTP
ICAO Code: LROP
Coordinates: 44°34′19″N, 26°6′7″E

Destination: Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W