
How far is Belgaum from Jixi?

The distance between Jixi (Jixi Xingkaihu Airport) and Belgaum (Belgaum Airport) is 3834 miles / 6171 kilometers / 3332 nautical miles.

The driving distance from Jixi (JXA) to Belgaum (IXG) is 5010 miles / 8063 kilometers, and travel time by car is about 93 hours 47 minutes.

Jixi Xingkaihu Airport – Belgaum Airport

Distance: 3834 miles / 6171 kilometers / 3332 nautical miles
Flight time: 7 h 45 min
Time difference: 2 h 30 min
CO2 emission: 436 kg


Distance from Jixi to Belgaum

There are several ways to calculate the distance from Jixi to Belgaum. Here are two standard methods:

Vincenty's formula (applied above)
  • 3834.244 miles
  • 6170.617 kilometers
  • 3331.867 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
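As an illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinate values are the JXA and IXG airport coordinates from this page converted to decimal degrees; this is a simplified implementation, not the site's exact code.

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0                    # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563            # WGS-84 flattening
    b = (1 - f) * a                  # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):             # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# JXA (45°17′34″N, 131°11′34″E) and IXG (15°51′33″N, 74°37′5″E) in decimal degrees
d_km = vincenty(45.29278, 131.19278, 15.85917, 74.61806) / 1000  # close to the ~6170.6 km above
```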

Haversine formula
  • 3831.914 miles
  • 6166.868 kilometers
  • 3329.842 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance (the shortest path between two points along the sphere's surface).
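The haversine calculation is compact enough to sketch in full. The coordinates below are this page's airport coordinates in decimal degrees, and the 6371 km mean earth radius is an assumption, so the result differs slightly from the figure above.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# JXA (45°17′34″N, 131°11′34″E) and IXG (15°51′33″N, 74°37′5″E) in decimal degrees
d_km = haversine(45.29278, 131.19278, 15.85917, 74.61806)  # within a few km of 6166.9
```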

How long does it take to fly from Jixi to Belgaum?

The estimated flight time from Jixi Xingkaihu Airport to Belgaum Airport is 7 hours and 45 minutes.
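The page does not state how this estimate is derived; a common rule of thumb, which lands close to the figure above, divides the great-circle distance by an assumed typical average jet speed:

```python
distance_miles = 3834        # great-circle distance from above
avg_speed_mph = 500          # assumed typical average speed, not the site's stated method
hours = distance_miles / avg_speed_mph  # ≈ 7.7 h, close to the 7 h 45 min shown
```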

Flight carbon footprint between Jixi Xingkaihu Airport (JXA) and Belgaum Airport (IXG)

On average, flying from Jixi to Belgaum generates about 436 kg of CO2 per passenger, which is roughly 961 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
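The kilograms-to-pounds conversion can be checked directly (1 kg ≈ 2.20462 lb):

```python
co2_kg = 436
co2_lbs = co2_kg * 2.20462   # ≈ 961 lbs, matching the figure above
```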

Map of flight path and driving directions from Jixi to Belgaum

See the map of the shortest flight path between Jixi Xingkaihu Airport (JXA) and Belgaum Airport (IXG).

Airport information

Origin Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
Destination Belgaum Airport
City: Belgaum
Country: India
IATA Code: IXG
ICAO Code: VABM
Coordinates: 15°51′33″N, 74°37′5″E