
How far is Jharsuguda from Lijiang?

The distance between Lijiang (Lijiang Sanyi International Airport) and Jharsuguda (Jharsuguda Airport) is 1072 miles / 1725 kilometers / 932 nautical miles.

The driving distance from Lijiang (LJG) to Jharsuguda (JRG) is 1798 miles / 2893 kilometers, and travel time by car is about 38 hours 9 minutes.

Lijiang Sanyi International Airport – Jharsuguda Airport

Distance: 1072 miles / 1725 kilometers / 932 nautical miles
Flight time: 2 h 31 min
Time difference: 2 h 30 min
CO2 emission: 155 kg


Distance from Lijiang to Jharsuguda

There are several ways to calculate the distance from Lijiang to Jharsuguda. Here are two standard methods:

Vincenty's formula (applied above)
  • 1071.966 miles
  • 1725.162 kilometers
  • 931.513 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
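For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iteration tolerance is an assumption of this sketch, and the decimal coordinates are converted from the DMS values listed under Airport information below; this is not the site's own code.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Vincenty's inverse formula on the WGS-84 ellipsoid (meters)."""
    a = 6378137.0            # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563    # WGS-84 flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitude
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):     # iterate lambda; nearly antipodal points may not converge
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0       # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1.0 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2.0 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line
        C = f / 16.0 * cos2Alpha * (4.0 + f * (4.0 - 3.0 * cos2Alpha))
        lamPrev = lam
        lam = L + (1.0 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4.0 * (
        cosSigma * (-1.0 + 2.0 * cos2SigmaM ** 2)
        - B / 6.0 * cos2SigmaM * (-3.0 + 4.0 * sinSigma ** 2)
        * (-3.0 + 4.0 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# LJG (26°40′45″N, 100°14′44″E) to JRG (21°54′48″N, 84°3′1″E)
meters = vincenty_distance_m(26.679167, 100.245556, 21.913333, 84.050278)
print(f"{meters / 1609.344:.3f} miles")   # ≈ 1071.97
print(f"{meters / 1000:.3f} km")          # ≈ 1725.16
print(f"{meters / 1852:.3f} nautical mi") # ≈ 931.51
```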

Haversine formula
  • 1070.722 miles
  • 1723.160 kilometers
  • 930.432 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the sphere's surface).
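For comparison, here is a short Python sketch of the haversine formula. The mean Earth radius of 6371 km is an assumed conventional value, and the coordinates are the same decimal conversions used above.

```python
import math

def haversine_distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance on a sphere of mean radius 6371 km."""
    R = 6371.0  # assumed mean Earth radius (km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(h))

km = haversine_distance_km(26.679167, 100.245556, 21.913333, 84.050278)
print(f"{km:.3f} km / {km / 1.609344:.3f} miles")  # ≈ 1723.2 km / 1070.7 miles
```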

How long does it take to fly from Lijiang to Jharsuguda?

The estimated flight time from Lijiang Sanyi International Airport to Jharsuguda Airport is 2 hours and 31 minutes.
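The page does not state its timing model. A common rule of thumb adds roughly 30 minutes for taxi, climb, and descent to cruise time at about 500 mph; the sketch below uses those assumed parameters, so it lands near, but not exactly on, the 2 hours 31 minutes quoted above.

```python
def estimated_flight_time_min(distance_miles,
                              cruise_mph=500.0,    # assumed average cruise speed
                              overhead_min=30.0):  # assumed taxi/climb/descent time
    """Rule-of-thumb flight time estimate in minutes (not the site's model)."""
    return overhead_min + distance_miles / cruise_mph * 60.0

minutes = estimated_flight_time_min(1072)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 2 h 39 min
```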

Flight carbon footprint between Lijiang Sanyi International Airport (LJG) and Jharsuguda Airport (JRG)

On average, flying from Lijiang to Jharsuguda generates about 155 kg of CO2 per passenger (155 kilograms is equal to 342 pounds). These figures are estimates and account only for the CO2 generated by burning jet fuel.
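The pounds figure is a straight unit conversion, using the standard factor of 1 kg ≈ 2.20462 lb:

```python
co2_kg = 155
print(f"{co2_kg * 2.20462:.0f} lbs")  # 155 kg × 2.20462 lb/kg ≈ 342 lbs
```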

Map of flight path and driving directions from Lijiang to Jharsuguda

See the map of the shortest flight path between Lijiang Sanyi International Airport (LJG) and Jharsuguda Airport (JRG).

Airport information

Origin: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
Destination: Jharsuguda Airport
City: Jharsuguda
Country: India
IATA Code: JRG
ICAO Code: VEJH
Coordinates: 21°54′48″N, 84°3′1″E
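The coordinates above are given in degrees, minutes, and seconds; the decimal-degree values used in the code sketches earlier come from the standard conversion:

```python
def dms_to_decimal(deg, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return deg + minutes / 60 + seconds / 3600

print(dms_to_decimal(26, 40, 45))   # LJG latitude  ≈ 26.679167
print(dms_to_decimal(100, 14, 44))  # LJG longitude ≈ 100.245556
print(dms_to_decimal(21, 54, 48))   # JRG latitude  ≈ 21.913333
print(dms_to_decimal(84, 3, 1))     # JRG longitude ≈ 84.050278
```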