
How far is Xiahe from Jamnagar?

The distance between Jamnagar (Jamnagar Airport) and Xiahe (Gannan Xiahe Airport) is 2145 miles / 3451 kilometers / 1864 nautical miles.

The driving distance from Jamnagar (JGA) to Xiahe (GXH) is 3031 miles / 4878 kilometers, and travel time by car is about 60 hours 8 minutes.

Jamnagar Airport – Gannan Xiahe Airport

  • Distance: 2145 miles / 3451 kilometers / 1864 nautical miles
  • Flight time: 4 h 33 min
  • Time difference: 2 h 30 min
  • CO2 emission: 234 kg

Distance from Jamnagar to Xiahe

There are several ways to calculate the distance from Jamnagar to Xiahe. Here are two standard methods:

Vincenty's formula (applied above)
  • 2144.624 miles
  • 3451.438 kilometers
  • 1863.627 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
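
As a cross-check, this ellipsoidal figure can be reproduced with the Python geographiclib package (an illustration only; the site does not say what software it uses). geographiclib solves the inverse geodesic problem on the WGS-84 ellipsoid using Karney's algorithm rather than Vincenty's iteration, so the two agree to well under a metre over a route like this:

    # Requires: pip install geographiclib
    from geographiclib.geodesic import Geodesic

    # Airport coordinates in decimal degrees (from the airport information below)
    jga_lat, jga_lon = 22.4653, 70.0125      # Jamnagar Airport (JGA)
    gxh_lat, gxh_lon = 34.8103, 102.6444     # Gannan Xiahe Airport (GXH)

    # Inverse geodesic problem: distance and azimuths between the two points
    result = Geodesic.WGS84.Inverse(jga_lat, jga_lon, gxh_lat, gxh_lon)
    print(result["s12"] / 1000)              # geodesic length in km, about 3451 km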

Haversine formula
  • 2142.322 miles
  • 3447.732 kilometers
  • 1861.627 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between two points along the earth's surface).
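
A minimal haversine implementation, using the airport coordinates listed below and the conventional mean Earth radius of 6371 km, lands on the same figure:

    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points on a spherical Earth."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    # Jamnagar Airport (JGA) to Gannan Xiahe Airport (GXH)
    print(haversine_km(22.4653, 70.0125, 34.8103, 102.6444))  # about 3448 km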

How long does it take to fly from Jamnagar to Xiahe?

The estimated flight time from Jamnagar Airport to Gannan Xiahe Airport is 4 hours and 33 minutes.
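
The site does not publish its flight-time formula, but dividing the distance by an assumed average block speed of about 470 mph (an assumption meant to cover cruise plus climb and descent) gives a comparable figure:

    distance_miles = 2144.624            # Vincenty distance from above
    avg_block_speed_mph = 470            # assumed average speed, gate to gate
    hours = distance_miles / avg_block_speed_mph
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # about 4 h 34 min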

Flight carbon footprint between Jamnagar Airport (JGA) and Gannan Xiahe Airport (GXH)

On average, flying from Jamnagar to Xiahe generates about 234 kg of CO2 per passenger, which is roughly 516 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
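
The kilogram-to-pound conversion is straightforward to verify:

    co2_kg = 234
    co2_lb = co2_kg * 2.20462   # 1 kg is about 2.20462 lb
    print(round(co2_lb))        # 516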

Map of flight path and driving directions from Jamnagar to Xiahe

See the map of the shortest flight path between Jamnagar Airport (JGA) and Gannan Xiahe Airport (GXH).

Airport information

Origin: Jamnagar Airport
City: Jamnagar
Country: India
IATA Code: JGA
ICAO Code: VAJM
Coordinates: 22°27′55″N, 70°0′45″E
Destination: Gannan Xiahe Airport
City: Xiahe
Country: China
IATA Code: GXH
ICAO Code: ZLXH
Coordinates: 34°48′37″N, 102°38′40″E