How far is Bazhong from Jorhat?

The distance between Jorhat (Jorhat Airport) and Bazhong (Bazhong Enyang Airport) is 828 miles / 1332 kilometers / 719 nautical miles.

The driving distance from Jorhat (JRH) to Bazhong (BZX) is 1516 miles / 2440 kilometers, and travel time by car is about 30 hours 55 minutes.

Jorhat Airport – Bazhong Enyang Airport

Distance: 828 miles / 1332 kilometers / 719 nautical miles
Flight time: 2 h 4 min
Time difference: 2 h 30 min
CO2 emission: 137 kg

Distance from Jorhat to Bazhong

There are several ways to calculate the distance from Jorhat to Bazhong. Here are two standard methods:

Vincenty's formula (applied above)
  • 827.689 miles
  • 1332.036 kilometers
  • 719.242 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
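
For readers who want to reproduce the figures, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The calculator does not publish which ellipsoid it uses, so WGS-84 is an assumption (it is the usual choice); the decimal coordinates are converted from the DMS values in the airport information below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two lat/lon points (decimal degrees)
    via Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the longitude difference on the auxiliary sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); cos2_alpha == 0 only for equatorial lines
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        raise ValueError("Vincenty iteration did not converge")

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# JRH (26°43′53″N, 94°10′31″E) to BZX (31°44′18″N, 106°38′41″E)
meters = vincenty_distance(26.731389, 94.175278, 31.738333, 106.644722)
print(f"{meters / 1609.344:.3f} mi, {meters / 1000:.3f} km, {meters / 1852:.3f} NM")
```

With WGS-84 this should land very close to the 827.689 mi / 1332.036 km figures above.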

Haversine formula
  • 826.840 miles
  • 1330.671 kilometers
  • 718.505 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
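
The haversine formula is much simpler. Below is a short Python sketch using the commonly quoted 6371 km mean Earth radius; the calculator's exact radius choice is not published, and a different radius would shift the result slightly.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two lat/lon points
    (decimal degrees) on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 \
        + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same JRH -> BZX coordinates as above, in decimal degrees
km = haversine_distance(26.731389, 94.175278, 31.738333, 106.644722)
print(f"{km:.3f} km ({km / 1.609344:.3f} mi, {km / 1.852:.3f} NM)")
```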

How long does it take to fly from Jorhat to Bazhong?

The estimated flight time from Jorhat Airport to Bazhong Enyang Airport is 2 hours and 4 minutes.
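
The page does not state its timing model. A common rule of thumb for such estimates is a fixed allowance for taxi, takeoff, and landing plus cruise at an assumed average speed; the sketch below is hypothetical (the function name and both constants are assumptions, not the site's model).

```python
def estimated_flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Hypothetical rule of thumb: fixed taxi/takeoff/landing overhead
    plus time at an assumed average cruise speed."""
    return overhead_min + distance_miles / cruise_mph * 60

minutes = estimated_flight_minutes(828)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")
```

With these assumed constants the estimate comes out near 2 h 9 min rather than the page's 2 h 4 min, so the calculator evidently uses slightly different parameters.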

Flight carbon footprint between Jorhat Airport (JRH) and Bazhong Enyang Airport (BZX)

On average, flying from Jorhat to Bazhong generates about 137 kg of CO2 per passenger; 137 kilograms is about 302 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
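
Both numbers follow from simple arithmetic on the page's own figures. The per-kilometer factor below is merely what the figures imply, not a published emissions model:

```python
co2_kg = 137        # the page's per-passenger estimate
distance_km = 1332  # great-circle distance from above

print(f"{co2_kg * 2.20462:.0f} lbs")              # kg -> pounds: ~302 lbs
print(f"{co2_kg / distance_km * 1000:.0f} g/km")  # implied ~103 g CO2 per passenger-km
```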

Map of flight path and driving directions from Jorhat to Bazhong

See the map of the shortest flight path between Jorhat Airport (JRH) and Bazhong Enyang Airport (BZX).

Airport information

Origin: Jorhat Airport
City: Jorhat
Country: India
IATA Code: JRH
ICAO Code: VEJT
Coordinates: 26°43′53″N, 94°10′31″E
Destination: Bazhong Enyang Airport
City: Bazhong
Country: China
IATA Code: BZX
ICAO Code: ZUBZ
Coordinates: 31°44′18″N, 106°38′41″E
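
The decimal-degree values used in the code sketches above come from these DMS coordinates. A small helper (the function name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, positive=True):
    """Convert degrees/minutes/seconds to decimal degrees; pass
    positive=False for southern latitudes or western longitudes."""
    sign = 1 if positive else -1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Both airports lie north and east, so all values are positive
print(dms_to_decimal(26, 43, 53), dms_to_decimal(94, 10, 31))   # JRH ~ 26.7314, 94.1753
print(dms_to_decimal(31, 44, 18), dms_to_decimal(106, 38, 41))  # BZX ~ 31.7383, 106.6447
```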