
How far is Yibin from Jharsuguda?

The distance between Jharsuguda (Jharsuguda Airport) and Yibin (Yibin Wuliangye Airport) is 1364 miles / 2195 kilometers / 1185 nautical miles.

The driving distance from Jharsuguda (JRG) to Yibin (YBP) is 2285 miles / 3678 kilometers, and travel time by car is about 43 hours 40 minutes.

Jharsuguda Airport – Yibin Wuliangye Airport

Distance: 1364 miles / 2195 kilometers / 1185 nautical miles
Flight time: 3 h 4 min
Time difference: 2 h 30 min
CO2 emission: 171 kg


Distance from Jharsuguda to Yibin

There are several ways to calculate the distance from Jharsuguda to Yibin. Here are two standard methods:

Vincenty's formula (applied above)
  • 1364.193 miles
  • 2195.456 kilometers
  • 1185.451 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
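As a rough illustration of the method, here is a sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. This is an assumed, simplified implementation (it omits the handling of nearly antipodal points, where the iteration can fail to converge), not the site's actual code; the airport coordinates below are taken from the airport information section, converted to decimal degrees.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; distance in km."""
    a = 6378137.0             # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    u1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    u2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sin_u1, cos_u1 = math.sin(u1), math.cos(u1)
    sin_u2, cos_u2 = math.sin(u2), math.cos(u2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cos_u2 * sin_lam,
                               cos_u1 * sin_u2 - sin_u1 * cos_u2 * cos_lam)
        cos_sigma = sin_u1 * sin_u2 + cos_u1 * cos_u2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cos_u1 * cos_u2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sin_u1 * sin_u2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# JRG (21°54′48″N, 84°3′1″E) and YBP (28°51′28″N, 104°31′30″E)
print(round(vincenty_km(21.9133, 84.0503, 28.8578, 104.5250), 1))
```

Run with the two airports' coordinates, this reproduces the ellipsoidal figure above (about 2195 km) to within rounding of the decimal-degree inputs.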

Haversine formula
  • 1362.749 miles
  • 2193.132 kilometers
  • 1184.197 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
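The haversine calculation is compact enough to show in full. The sketch below assumes a spherical earth with the commonly used mean radius of 6371 km; the coordinates are the two airports' positions from the airport information section, converted to decimal degrees.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres, assuming a spherical earth."""
    r = 6371.0  # mean earth radius (km), a common assumption
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# JRG (21°54′48″N, 84°3′1″E) and YBP (28°51′28″N, 104°31′30″E)
print(round(haversine_km(21.9133, 84.0503, 28.8578, 104.5250), 1))
```

This agrees with the great-circle figure above (about 2193 km); the small gap from the Vincenty result reflects the spherical-earth simplification.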

How long does it take to fly from Jharsuguda to Yibin?

The estimated flight time from Jharsuguda Airport to Yibin Wuliangye Airport is 3 hours and 4 minutes.
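Flight time estimates like this are typically derived from distance and a typical cruise speed. As a hypothetical rule of thumb (not necessarily the method used here): assume a cruise speed of about 500 mph plus a fixed 30 minutes for taxi, climb, and descent. That lands within roughly ten minutes of the 3 h 4 min figure above.

```python
def flight_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus time at cruise speed.
    The 500 mph cruise speed and 30 min overhead are assumed values."""
    return overhead_min + distance_miles / cruise_mph * 60

est = flight_minutes(1364)
print(f"{est // 60:.0f} h {est % 60:.0f} min")
```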

Flight carbon footprint between Jharsuguda Airport (JRG) and Yibin Wuliangye Airport (YBP)

On average, flying from Jharsuguda to Yibin generates about 171 kg of CO2 per passenger, which is equivalent to about 377 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Jharsuguda to Yibin

See the map of the shortest flight path between Jharsuguda Airport (JRG) and Yibin Wuliangye Airport (YBP).

Airport information

Origin: Jharsuguda Airport
City: Jharsuguda
Country: India
IATA Code: JRG
ICAO Code: VEJH
Coordinates: 21°54′48″N, 84°3′1″E
Destination: Yibin Wuliangye Airport
City: Yibin
Country: China
IATA Code: YBP
ICAO Code: ZUYB
Coordinates: 28°51′28″N, 104°31′30″E