
How far is Wanxian from Jorhat?

The distance between Jorhat (Jorhat Airport) and Wanxian (Wanzhou Wuqiao Airport) is 908 miles / 1461 kilometers / 789 nautical miles.

The driving distance from Jorhat (JRH) to Wanxian (WXN) is 1611 miles / 2592 kilometers, and travel time by car is about 32 hours 35 minutes.

Jorhat Airport – Wanzhou Wuqiao Airport

Distance: 908 miles / 1461 kilometers / 789 nautical miles
Flight time: 2 h 13 min
Time difference: 2 h 30 min
CO2 emission: 144 kg


Distance from Jorhat to Wanxian

There are several ways to calculate the distance from Jorhat to Wanxian. Here are two standard methods:

Vincenty's formula (applied above)
  • 907.654 miles
  • 1460.727 kilometers
  • 788.729 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
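Vincenty's inverse method can be sketched in standard-library Python. This is a minimal implementation assuming the WGS-84 ellipsoid (the page does not state which ellipsoid it uses), fed with the airport coordinates from the table below converted to decimal degrees:

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula: geodesic distance in metres on WGS-84."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate longitude difference to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
                                    + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                              * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# JRH -> WXN, DMS coordinates from the airport table as decimal degrees
d_km = vincenty_m(26.731389, 94.175278, 30.835833, 108.405833) / 1000
print(f"{d_km:.3f} km")
```

With the WGS-84 constants this lands on roughly 1460.7 km, in line with the Vincenty figure quoted above.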

Haversine formula
  • 906.391 miles
  • 1458.695 kilometers
  • 787.632 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
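The haversine computation fits in a few lines of Python. This sketch assumes a mean Earth radius of 6371 km (the page does not say which radius it uses, so the last digit may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km on a sphere."""
    R = 6371.0  # assumed mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# JRH -> WXN with the airport coordinates below, in decimal degrees
d = haversine_km(26.731389, 94.175278, 30.835833, 108.405833)
print(round(d, 1))
```

This comes out within about a kilometre of the 1458.695 km quoted above; the small residual depends on the radius chosen.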

How long does it take to fly from Jorhat to Wanxian?

The estimated flight time from Jorhat Airport to Wanzhou Wuqiao Airport is 2 hours and 13 minutes.
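Flight-time estimates like this are typically a cruise leg plus a fixed allowance for taxi, climb, and descent. The site's exact model is not published; the sketch below assumes a round 500 mph cruise speed and a 30-minute overhead, which lands within a few minutes of the 2 h 13 min quoted above:

```python
def estimated_flight_time_min(distance_miles, cruise_mph=500, overhead_min=30):
    """Ballpark block time in minutes: cruise leg plus fixed taxi/climb/descent
    overhead. cruise_mph and overhead_min are assumed round numbers, not the
    site's actual parameters."""
    return overhead_min + distance_miles / cruise_mph * 60

t = estimated_flight_time_min(908)
print(f"{int(t // 60)} h {round(t % 60)} min")
```

With these assumptions the estimate is about 2 h 19 min, a few minutes above the site's figure, which suggests it uses a slightly faster cruise speed or smaller overhead.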

Flight carbon footprint between Jorhat Airport (JRH) and Wanzhou Wuqiao Airport (WXN)

On average, flying from Jorhat to Wanxian generates about 144 kg (around 318 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
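The kilogram-to-pound conversion is fixed by the definition of the pound. Note that the already-rounded 144 kg converts to about 317 lb; the 318 lb above presumably comes from the unrounded kilogram figure:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 144
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # about 317 lb from the rounded kg value
```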

Map of flight path and driving directions from Jorhat to Wanxian

See the map of the shortest flight path between Jorhat Airport (JRH) and Wanzhou Wuqiao Airport (WXN).

Airport information

Origin Jorhat Airport
City: Jorhat
Country: India
IATA Code: JRH
ICAO Code: VEJT
Coordinates: 26°43′53″N, 94°10′31″E
Destination Wanzhou Wuqiao Airport
City: Wanxian
Country: China
IATA Code: WXN
ICAO Code: ZUWX
Coordinates: 30°50′9″N, 108°24′21″E
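The coordinates above are in degrees/minutes/seconds; the distance formulas want signed decimal degrees. A small converter, applied to both airports:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed
    decimal degrees (S and W are negative)."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Jorhat Airport (JRH): 26 deg 43' 53" N, 94 deg 10' 31" E
jrh = (dms_to_decimal(26, 43, 53, "N"), dms_to_decimal(94, 10, 31, "E"))
# Wanzhou Wuqiao Airport (WXN): 30 deg 50' 9" N, 108 deg 24' 21" E
wxn = (dms_to_decimal(30, 50, 9, "N"), dms_to_decimal(108, 24, 21, "E"))
print(jrh, wxn)
```

This yields roughly (26.7314, 94.1753) for JRH and (30.8358, 108.4058) for WXN, the decimal values used in the distance examples above.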