
How far is Hubli from Jeju?

The distance between Jeju (Jeju International Airport) and Hubli (Hubli Airport) is 3432 miles / 5524 kilometers / 2982 nautical miles.

The driving distance from Jeju (CJU) to Hubli (HBX) is 5164 miles / 8311 kilometers, and travel time by car is about 97 hours 56 minutes.

Jeju International Airport – Hubli Airport

Distance: 3432 miles / 5524 kilometers / 2982 nautical miles
Flight time: 6 h 59 min
Time difference: 3 h 30 min
CO2 emission: 386 kg
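The 3 h 30 min time difference follows from the airports' time zones: Jeju uses Korea Standard Time (UTC+9) and Hubli uses India Standard Time (UTC+5:30). A minimal sketch of that arithmetic, using Python's standard `zoneinfo` module with the standard tzdata zone names `Asia/Seoul` and `Asia/Kolkata`:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Take a single instant and compare its UTC offsets in the two zones.
now = datetime.now(timezone.utc)
offset_jeju = now.astimezone(ZoneInfo("Asia/Seoul")).utcoffset()    # UTC+9:00
offset_hubli = now.astimezone(ZoneInfo("Asia/Kolkata")).utcoffset() # UTC+5:30

diff = offset_jeju - offset_hubli
print(diff)  # 3:30:00
```

Neither zone observes daylight saving time, so the difference is 3 h 30 min year-round.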


Distance from Jeju to Hubli

There are several ways to calculate the distance from Jeju to Hubli. Here are two standard methods:

Vincenty's formula (applied above)
  • 3432.198 miles
  • 5523.587 kilometers
  • 2982.498 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
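A sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, with the two airports' published coordinates converted to decimal degrees (the rounded decimal values are my conversion from the DMS coordinates listed below):

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; distance in km."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(max_iter):   # iterate on the longitude difference lambda
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# CJU (33°30′40″N, 126°29′34″E) to HBX (15°21′42″N, 75°5′5″E)
km = vincenty_km(33.5111, 126.4928, 15.3617, 75.0847)  # ≈ 5524 km
```

With coordinates rounded to four decimal places the result lands within a few hundred meters of the 5523.587 km figure above.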

Haversine formula
  • 3428.898 miles
  • 5518.276 kilometers
  • 2979.631 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
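The haversine formula is short enough to sketch directly; this version assumes a mean Earth radius of 6371 km and the same decimal-degree coordinates converted from the airport listings below:

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

# CJU (33°30′40″N, 126°29′34″E) to HBX (15°21′42″N, 75°5′5″E)
km = haversine_km(33.5111, 126.4928, 15.3617, 75.0847)  # ≈ 5518 km ≈ 3429 miles
```

The spherical result is about 5 km shorter than the ellipsoidal Vincenty figure, which matches the gap between the two sets of numbers above.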

How long does it take to fly from Jeju to Hubli?

The estimated flight time from Jeju International Airport to Hubli Airport is 6 hours and 59 minutes.
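A quick sanity check of that estimate: dividing the 3432-mile distance by 6 h 59 min gives the average speed the estimate implies, cruise plus the slower climb and descent phases.

```python
distance_miles = 3432
flight_time_hours = 6 + 59 / 60            # 6 h 59 min, from the estimate above

avg_speed = distance_miles / flight_time_hours
print(round(avg_speed))  # 491 mph average over the whole flight
```

An implied average just under 500 mph is typical for a long-haul jet airliner.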

Flight carbon footprint between Jeju International Airport (CJU) and Hubli Airport (HBX)

On average, flying from Jeju to Hubli generates about 386 kg of CO2 per passenger, which is roughly 851 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
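The pound figure is a straight unit conversion from the kilogram estimate, using the exact definition of the international pound (1 lb = 0.45359237 kg):

```python
co2_kg = 386
KG_PER_LB = 0.45359237          # exact definition of the international pound

co2_lbs = co2_kg / KG_PER_LB
print(round(co2_lbs))  # 851
```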

Map of flight path and driving directions from Jeju to Hubli

See the map of the shortest flight path between Jeju International Airport (CJU) and Hubli Airport (HBX).

Airport information

Origin: Jeju International Airport
City: Jeju
Country: South Korea
IATA Code: CJU
ICAO Code: RKPC
Coordinates: 33°30′40″N, 126°29′34″E

Destination: Hubli Airport
City: Hubli
Country: India
IATA Code: HBX
ICAO Code: VAHB
Coordinates: 15°21′42″N, 75°5′5″E
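The coordinates above are in degrees/minutes/seconds, while the distance formulas want decimal degrees. A small conversion helper (`dms_to_decimal` is a name I've chosen for illustration):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter (N/S/E/W)
    to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Airport coordinates from the listings above, as (latitude, longitude) pairs
cju = (dms_to_decimal(33, 30, 40, "N"), dms_to_decimal(126, 29, 34, "E"))
hbx = (dms_to_decimal(15, 21, 42, "N"), dms_to_decimal(75, 5, 5, "E"))
print(cju)  # (33.5111..., 126.4927...)
```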