How far is Qingyang from Gaya?

The distance between Gaya (Gaya Airport) and Qingyang (Qingyang Xifeng Airport) is 1548 miles / 2491 kilometers / 1345 nautical miles.

The driving distance from Gaya (GAY) to Qingyang (IQN) is 2353 miles / 3786 kilometers, and travel time by car is about 44 hours 13 minutes.

Gaya Airport – Qingyang Xifeng Airport

Distance: 1548 miles / 2491 kilometers / 1345 nautical miles
Flight time: 3 h 25 min
Time difference: 2 h 30 min
CO2 emission: 183 kg

Distance from Gaya to Qingyang

There are several ways to calculate the distance from Gaya to Qingyang. Here are two standard methods:

Vincenty's formula (applied above)
  • 1548.063 miles
  • 2491.366 kilometers
  • 1345.230 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
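For a quick cross-check of the ellipsoidal figure, a library such as geopy can be used. Its geodesic distance is computed on the WGS-84 ellipsoid (via Karney's algorithm, which agrees with Vincenty's method to well within a millimeter at this scale). A minimal sketch, with the airport coordinates from the "Airport information" section converted to decimal degrees:

```python
from geopy.distance import geodesic  # pip install geopy

gaya = (24.7442, 84.9511)        # GAY: 24°44′39″N, 84°57′4″E
qingyang = (35.7994, 107.6028)   # IQN: 35°47′58″N, 107°36′10″E

d = geodesic(gaya, qingyang)     # distance on the WGS-84 ellipsoid
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} NM")
# ≈ 1548.063 mi / 2491.366 km / 1345.230 NM
```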

Haversine formula
  • 1546.895 miles
  • 2489.486 kilometers
  • 1344.215 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
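A self-contained haversine implementation reproduces the spherical figures above. This is a sketch assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# GAY (24°44′39″N, 84°57′4″E) to IQN (35°47′58″N, 107°36′10″E), decimal degrees
km = haversine(24.7442, 84.9511, 35.7994, 107.6028)
print(f"{km:.1f} km / {km * 0.621371:.1f} mi / {km / 1.852:.1f} NM")
# ≈ 2489.5 km / 1546.9 mi / 1344.2 NM
```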

How long does it take to fly from Gaya to Qingyang?

The estimated flight time from Gaya Airport to Qingyang Xifeng Airport is 3 hours and 25 minutes.
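The cruise-speed and taxi-time assumptions behind this estimate aren't published on the page, but the quoted figures themselves imply an average block speed, which a quick calculation recovers:

```python
# Implied average block speed over the 1548-mile great-circle route,
# given the quoted 3 h 25 min flight time (an assumption: the estimate
# is a simple distance / average-speed figure).
distance_mi = 1548
block_time_h = 3 + 25 / 60
print(f"{distance_mi / block_time_h:.0f} mph")  # ≈ 453 mph
```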

Flight carbon footprint between Gaya Airport (GAY) and Qingyang Xifeng Airport (IQN)

On average, flying from Gaya to Qingyang generates about 183 kg of CO2 per passenger, equivalent to roughly 403 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
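The page doesn't state its emission model, but the quoted numbers imply a per-passenger factor of roughly 73 g of CO2 per kilometer flown, as this quick check shows (assuming the estimate scales linearly with distance):

```python
# Per-passenger emission factor implied by the figures above (assumption:
# 183 kg of CO2 spread evenly over the 2491 km great-circle distance).
co2_kg, distance_km = 183, 2491
print(f"{co2_kg / distance_km * 1000:.0f} g CO2 per passenger-km")  # ≈ 73 g
print(f"{co2_kg * 2.20462:.0f} lbs")                                # ≈ 403 lbs
```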

Map of flight path and driving directions from Gaya to Qingyang

See the map of the shortest flight path between Gaya Airport (GAY) and Qingyang Xifeng Airport (IQN).

Airport information

Origin Gaya Airport
City: Gaya
Country: India
IATA Code: GAY
ICAO Code: VEGY
Coordinates: 24°44′39″N, 84°57′4″E
Destination Qingyang Xifeng Airport
City: Qingyang
Country: China
IATA Code: IQN
ICAO Code: ZLQY
Coordinates: 35°47′58″N, 107°36′10″E
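To use these coordinates in the distance formulas above, they must be converted from degrees/minutes/seconds to decimal degrees. A minimal helper (the function name is illustrative, not from the source):

```python
def dms_to_decimal(degrees, minutes, seconds, positive=True):
    """Convert degrees/minutes/seconds to decimal degrees; S and W are negative."""
    value = degrees + minutes / 60 + seconds / 3600
    return value if positive else -value

print(dms_to_decimal(24, 44, 39))   # GAY latitude   ≈ 24.7442
print(dms_to_decimal(84, 57, 4))    # GAY longitude  ≈ 84.9511
print(dms_to_decimal(35, 47, 58))   # IQN latitude   ≈ 35.7994
print(dms_to_decimal(107, 36, 10))  # IQN longitude  ≈ 107.6028
```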