
How far is Uranium City from Santo Domingo?

The distance between Santo Domingo (Las Américas International Airport) and Uranium City (Uranium City Airport) is 3431 miles / 5522 kilometers / 2982 nautical miles.

Las Américas International Airport – Uranium City Airport

  • 3431 miles
  • 5522 kilometers
  • 2982 nautical miles


Distance from Santo Domingo to Uranium City

There are several ways to calculate the distance from Santo Domingo to Uranium City. Here are two standard methods:

Vincenty's formula (applied above)
  • 3431.401 miles
  • 5522.304 kilometers
  • 2981.806 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
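The ellipsoidal calculation can be sketched with the standard iterative Vincenty inverse formula on the WGS-84 ellipsoid. This is not the calculator's own code, just a self-contained illustration using the airport coordinates listed below; it reproduces the ≈5,522.3 km figure above.

```python
import math

# Airport coordinates from the table below (decimal degrees).
SDQ = (18 + 25/60 + 46/3600, -(69 + 40/60 + 8/3600))    # 18°25′46″N, 69°40′8″W
YBE = (59 + 33/60 + 41/3600, -(108 + 28/60 + 51/3600))  # 59°33′41″N, 108°28′51″W

def vincenty_m(p1, p2):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a
    lat1, lon1 = p1
    lat2, lon2 = p2
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate longitude difference until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

km = vincenty_m(SDQ, YBE) / 1000
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")
```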

Haversine formula
  • 3432.443 miles
  • 5523.981 kilometers
  • 2982.711 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
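The haversine calculation can be reproduced in a few lines. This is a sketch rather than the calculator's own code; it assumes a mean Earth radius of 6,371 km, which matches the ≈5,524 km figure above to within rounding.

```python
import math

# Airport coordinates from the table below (decimal degrees).
SDQ = (18 + 25/60 + 46/3600, -(69 + 40/60 + 8/3600))    # 18°25′46″N, 69°40′8″W
YBE = (59 + 33/60 + 41/3600, -(108 + 28/60 + 51/3600))  # 59°33′41″N, 108°28′51″W

def haversine_km(p1, p2, radius_km=6371.0):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    lat1, lon1 = map(math.radians, p1)
    lat2, lon2 = map(math.radians, p2)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2 +
         math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(SDQ, YBE)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")
```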

How long does it take to fly from Santo Domingo to Uranium City?

The estimated flight time from Las Américas International Airport to Uranium City Airport is 6 hours and 59 minutes.
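The calculator's exact assumptions are not published. A minimal sketch of this kind of estimate divides the distance by an assumed average cruise speed and adds a fixed allowance for taxi, climb and descent; the speed and allowance below are illustrative assumptions, so the result only approximates the 6 h 59 min figure above.

```python
def flight_time_hours(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: cruise time plus a fixed taxi/climb allowance.
    Both parameters are assumptions, not the calculator's published values."""
    return distance_miles / cruise_mph + overhead_hours

hours = flight_time_hours(3431)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"about {h} h {m:02d} min")  # → about 7 h 22 min with these assumptions
```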

Flight carbon footprint between Las Américas International Airport (SDQ) and Uranium City Airport (YBE)

On average, flying from Santo Domingo to Uranium City generates about 386 kg of CO2 per passenger, equivalent to 852 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Santo Domingo to Uranium City

See the map of the shortest flight path between Las Américas International Airport (SDQ) and Uranium City Airport (YBE).

Airport information

Origin: Las Américas International Airport
City: Santo Domingo
Country: Dominican Republic
IATA Code: SDQ
ICAO Code: MDSD
Coordinates: 18°25′46″N, 69°40′8″W
Destination: Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W