
How far is Uranium City from Jakarta?

The distance between Jakarta (Soekarno–Hatta International Airport) and Uranium City (Uranium City Airport) is 8311 miles / 13375 kilometers / 7222 nautical miles.

Soekarno–Hatta International Airport – Uranium City Airport

Distance: 8311 miles / 13375 kilometers / 7222 nautical miles
Flight time: 16 h 14 min
CO2 emission: 1 044 kg


Distance from Jakarta to Uranium City

There are several ways to calculate the distance from Jakarta to Uranium City. Here are two standard methods:

Vincenty's formula (applied above)
  • 8310.688 miles
  • 13374.755 kilometers
  • 7221.790 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
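As a rough illustration (not the site's own code), the ellipsoidal distance can be reproduced in Python with the geopy library, whose geodesic() function computes distances on the WGS-84 ellipsoid and agrees very closely with Vincenty's formula. The decimal coordinates below are converted from the DMS values listed in the airport information section.

```python
# Ellipsoidal (WGS-84) distance between CGK and YBE.
# geopy's geodesic() uses Karney's algorithm, which agrees with
# Vincenty's formula to well under a metre over a route like this.
from geopy.distance import geodesic

cgk = (-6.12556, 106.65583)    # 6°7′32″S, 106°39′21″E
ybe = (59.56139, -108.48083)   # 59°33′41″N, 108°28′51″W

d = geodesic(cgk, ybe)
print(f"{d.miles:.1f} miles, {d.kilometers:.1f} km, {d.nautical:.1f} NM")
# Expected to land very close to the 8310.7 mi / 13374.8 km / 7221.8 NM figures above.
```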

Haversine formula
  • 8309.215 miles
  • 13372.385 kilometers
  • 7220.510 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
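For reference, a minimal haversine implementation looks like the sketch below. The 6 371 km mean Earth radius is an assumption (the site does not state which radius it uses), so the last digits may differ slightly from the figures above.

```python
# Great-circle (haversine) distance between CGK and YBE on a spherical Earth.
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Return the great-circle distance in kilometres (assumed mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(-6.12556, 106.65583, 59.56139, -108.48083)
print(f"{km:.1f} km, {km / 1.609344:.1f} mi, {km / 1.852:.1f} NM")
```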

How long does it take to fly from Jakarta to Uranium City?

The estimated flight time from Soekarno–Hatta International Airport to Uranium City Airport is 16 hours and 14 minutes.
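The site does not publish its flight-time model. One simple assumption that reproduces the quoted figure is a cruise speed of about 850 km/h plus a 30-minute allowance for take-off and landing; treat both numbers as illustrative, not as a published formula.

```python
# Rough flight-time estimate from the great-circle distance.
# The 850 km/h cruise speed and 30-minute overhead are assumptions chosen
# because they reproduce the 16 h 14 min figure quoted above.
distance_km = 13374.755
cruise_kmh = 850.0
overhead_min = 30

total_min = round(distance_km / cruise_kmh * 60) + overhead_min
print(f"{total_min // 60} h {total_min % 60} min")  # -> 16 h 14 min
```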

Flight carbon footprint between Soekarno–Hatta International Airport (CGK) and Uranium City Airport (YBE)

On average, flying from Jakarta to Uranium City generates about 1 044 kg of CO2 per passenger; 1 044 kilograms equals 2 302 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
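A quick unit check on those numbers follows; the per-kilometre rate is simply derived from the figures above, not a published emission factor.

```python
# Kilograms-to-pounds conversion for the CO2 figure, plus the implied
# per-passenger-kilometre rate (derived, not an official factor).
co2_kg = 1044
distance_km = 13374.755

print(f"{co2_kg * 2.20462:.0f} lbs")                  # -> 2302 lbs
print(f"{co2_kg / distance_km * 1000:.0f} g CO2/km")  # ~78 g per passenger-km
```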

Map of flight path from Jakarta to Uranium City

See the map of the shortest flight path between Soekarno–Hatta International Airport (CGK) and Uranium City Airport (YBE).

Airport information

Origin: Soekarno–Hatta International Airport
City: Jakarta
Country: Indonesia
IATA Code: CGK
ICAO Code: WIII
Coordinates: 6°7′32″S, 106°39′21″E
Destination: Uranium City Airport
City: Uranium City
Country: Canada
IATA Code: YBE
ICAO Code: CYBE
Coordinates: 59°33′41″N, 108°28′51″W
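For anyone reproducing the calculations above, the DMS coordinates convert to the decimal degrees used in the earlier sketches as shown here; the helper function is illustrative, not from the site.

```python
# Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

cgk = (dms_to_decimal(6, 7, 32, "S"), dms_to_decimal(106, 39, 21, "E"))
ybe = (dms_to_decimal(59, 33, 41, "N"), dms_to_decimal(108, 28, 51, "W"))
print(cgk)  # (-6.125..., 106.655...)
print(ybe)  # (59.561..., -108.480...)
```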