
How far is Beijing from Jharsuguda?

The distance between Jharsuguda (Jharsuguda Airport) and Beijing (Beijing Nanyuan Airport) is 2263 miles / 3642 kilometers / 1966 nautical miles.

The driving distance from Jharsuguda (JRG) to Beijing (NAY) is 3180 miles / 5117 kilometers, and travel time by car is about 59 hours 59 minutes.

Jharsuguda Airport – Beijing Nanyuan Airport

Distance: 2263 miles / 3642 kilometers / 1966 nautical miles
Flight time: 4 h 47 min
Time difference: 2 h 30 min
CO2 emission: 248 kg


Distance from Jharsuguda to Beijing

There are several ways to calculate the distance from Jharsuguda to Beijing. Here are two standard methods:

Vincenty's formula (applied above)
  • 2262.935 miles
  • 3641.842 kilometers
  • 1966.437 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 2261.719 miles
  • 3639.885 kilometers
  • 1965.381 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
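For illustration, here is a minimal Python sketch of the haversine formula applied to the airport coordinates listed further down the page. The mean Earth radius of 6371 km and the decimal conversion of the coordinates are assumptions of this sketch, so the output only approximates the figures above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# JRG: 21°54′48″N, 84°3′1″E and NAY: 39°46′58″N, 116°23′16″E, converted to decimal degrees
jrg = (21.9133, 84.0503)
nay = (39.7828, 116.3878)

km = haversine_km(*jrg, *nay)
print(f"{km:.0f} km ≈ {km * 0.621371:.0f} mi ≈ {km * 0.539957:.0f} nmi")
# Prints roughly 3640 km / 2262 mi / 1966 nmi, close to the haversine figures above.
```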

How long does it take to fly from Jharsuguda to Beijing?

The estimated flight time from Jharsuguda Airport to Beijing Nanyuan Airport is 4 hours and 47 minutes.
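The exact assumptions behind the 4 hours 47 minutes figure are not stated. A common rule of thumb adds a fixed allowance for takeoff and landing to the great-circle distance flown at a typical cruise speed; the sketch below uses an assumed 500 mph cruise and a 30-minute overhead, so it only lands in the same ballpark as the published time.

```python
# Rough flight-time estimate: cruise time plus a fixed takeoff/landing allowance.
# The 500 mph cruise speed and 30-minute overhead are assumptions, not the
# calculator's published model.
distance_miles = 2263
cruise_mph = 500
overhead_hours = 0.5

total_hours = distance_miles / cruise_mph + overhead_hours
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
print(f"Estimated flight time: about {hours} h {minutes} min")  # ~5 h 2 min
```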

Flight carbon footprint between Jharsuguda Airport (JRG) and Beijing Nanyuan Airport (NAY)

On average, flying from Jharsuguda to Beijing generates about 248 kg of CO2 per passenger; 248 kilograms is equal to 546 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
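As a quick check of the unit conversion, 1 kilogram is about 2.20462 pounds; the short sketch below reproduces the kg-to-lbs figure and the per-mile emission rate implied by the numbers above.

```python
co2_kg = 248          # estimated CO2 per passenger for this flight
distance_miles = 2263

co2_lbs = co2_kg * 2.20462          # kilograms to pounds
per_mile = co2_kg / distance_miles  # emission rate implied by the figures above

print(f"{co2_kg} kg ≈ {co2_lbs:.0f} lbs")          # ≈ 547 lbs (the article rounds to 546)
print(f"≈ {per_mile:.3f} kg CO2 per mile flown")   # ≈ 0.110 kg/mi
```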

Map of flight path and driving directions from Jharsuguda to Beijing

See the map of the shortest flight path between Jharsuguda Airport (JRG) and Beijing Nanyuan Airport (NAY).

Airport information

Origin Jharsuguda Airport
City: Jharsuguda
Country: India
IATA Code: JRG
ICAO Code: VEJH
Coordinates: 21°54′48″N, 84°3′1″E
Destination Beijing Nanyuan Airport
City: Beijing
Country: China
IATA Code: NAY
ICAO Code: ZBNY
Coordinates: 39°46′58″N, 116°23′16″E