
How far is Yantai from Jharsuguda?

The distance between Jharsuguda (Jharsuguda Airport) and Yantai (Yantai Penglai International Airport) is 2449 miles / 3941 kilometers / 2128 nautical miles.

The driving distance from Jharsuguda (JRG) to Yantai (YNT) is 3413 miles / 5493 kilometers, and travel time by car is about 63 hours 51 minutes.

Jharsuguda Airport – Yantai Penglai International Airport

Distance:
  • 2449 miles
  • 3941 kilometers
  • 2128 nautical miles
Flight time: 5 h 8 min
Time difference: 2 h 30 min
CO2 emission: 269 kg


Distance from Jharsuguda to Yantai

There are several ways to calculate the distance from Jharsuguda to Yantai. Here are two standard methods:

Vincenty's formula (applied above)
  • 2448.609 miles
  • 3940.654 kilometers
  • 2127.783 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
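As a rough sketch of how such a figure is computed, here is the standard iterative Vincenty inverse method on the WGS-84 ellipsoid. The coordinates are the two airports' positions converted to decimal degrees; tolerance and iteration limits are illustrative choices, not taken from this site.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# JRG and YNT coordinates in decimal degrees
d_km = vincenty_inverse(21.9133, 84.0503, 37.6569, 120.9869) / 1000
```

Because the ellipsoidal model accounts for the Earth's polar flattening, this figure differs slightly from the spherical result below.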

Haversine formula
  • 2446.275 miles
  • 3936.899 kilometers
  • 2125.755 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
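The haversine calculation is much simpler. A minimal sketch, assuming a mean Earth radius of 6371 km and the airport coordinates converted to decimal degrees:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km, assuming a spherical Earth of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# JRG (21°54′48″N, 84°3′1″E) to YNT (37°39′25″N, 120°59′13″E)
d_km = haversine(21.9133, 84.0503, 37.6569, 120.9869)
```

The result lands within a few kilometers of the ellipsoidal figure, which is typical: the two methods rarely disagree by more than about 0.5%.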

How long does it take to fly from Jharsuguda to Yantai?

The estimated flight time from Jharsuguda Airport to Yantai Penglai International Airport is 5 hours and 8 minutes.
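A common rule of thumb for such estimates adds a fixed overhead for taxi, climb, and descent to the cruise time. The 500 mph cruise speed and 30-minute overhead below are illustrative assumptions, not this site's actual parameters (its 5 h 8 min figure implies a slightly faster average speed):

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rule-of-thumb block time: fixed overhead plus cruise at an average speed."""
    total_min = round(overhead_min + distance_miles / cruise_mph * 60)
    return divmod(total_min, 60)  # (hours, minutes)

hours, minutes = estimate_flight_time(2449)
```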

Flight carbon footprint between Jharsuguda Airport (JRG) and Yantai Penglai International Airport (YNT)

On average, flying from Jharsuguda to Yantai generates about 269 kg of CO2 per passenger, equivalent to 593 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
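The unit conversion and the implied per-mile intensity are straightforward to check (the per-mile helper is just an illustration, not part of the site's methodology):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lbs(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

def emission_per_mile(total_kg, distance_miles):
    """Average emission intensity for a route, in kg CO2 per passenger-mile."""
    return total_kg / distance_miles

lbs = kg_to_lbs(269)                    # about 593 lbs
intensity = emission_per_mile(269, 2449)  # about 0.11 kg/mile
```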

Map of flight path and driving directions from Jharsuguda to Yantai

See the map of the shortest flight path between Jharsuguda Airport (JRG) and Yantai Penglai International Airport (YNT).

Airport information

Origin: Jharsuguda Airport
City: Jharsuguda
Country: India
IATA Code: JRG
ICAO Code: VEJH
Coordinates: 21°54′48″N, 84°3′1″E
Destination: Yantai Penglai International Airport
City: Yantai
Country: China
IATA Code: YNT
ICAO Code: ZSYT
Coordinates: 37°39′25″N, 120°59′13″E
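The coordinates above are in degrees-minutes-seconds form, while distance formulas need decimal degrees. A small helper for the conversion, matching the exact notation used here (the prime and double-prime symbols are assumed to follow this page's format):

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like '21°54′48″N' to signed decimal degrees."""
    m = re.fullmatch(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    if m is None:
        raise ValueError(f"unrecognized coordinate: {dms!r}")
    deg, minutes, seconds, hemi = m.groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

lat = dms_to_decimal("21°54′48″N")  # Jharsuguda Airport latitude
lon = dms_to_decimal("84°3′1″E")    # Jharsuguda Airport longitude
```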