
How far is Qian Gorlos Mongol Autonomous County from Anapa?

The distance between Anapa (Anapa Airport) and Qian Gorlos Mongol Autonomous County (Songyuan Chaganhu Airport) is 4047 miles / 6512 kilometers / 3516 nautical miles.


Distance from Anapa to Qian Gorlos Mongol Autonomous County

There are several ways to calculate the distance from Anapa to Qian Gorlos Mongol Autonomous County. Here are two standard methods:

Vincenty's formula (applied above)
  • 4046.650 miles
  • 6512.451 kilometers
  • 3516.442 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
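As a sketch of how this works, here is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. (The exact constants and rounding the site uses are an assumption; WGS-84 is the conventional choice.)

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Geodesic distance in km on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0               # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # first approximation of the longitude difference on the sphere
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos2_alpha == 0 only for equatorial geodesics
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# AAQ (45°0′7″N, 37°20′50″E) and YSQ (44°56′17″N, 124°33′0″E) in decimal degrees
aaq = (45 + 7 / 3600, 37 + 20 / 60 + 50 / 3600)
ysq = (44 + 56 / 60 + 17 / 3600, 124 + 33 / 60)
print(round(vincenty_km(*aaq, *ysq), 3))  # kilometres; the page quotes 6512.451
```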

Haversine formula
  • 4035.440 miles
  • 6494.411 kilometers
  • 3506.701 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
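The haversine computation is much shorter. A sketch using the airport coordinates listed below (the mean Earth radius of 6371 km is an assumption, though it closely reproduces the figure above):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km, treating Earth as a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# AAQ (45°0′7″N, 37°20′50″E) and YSQ (44°56′17″N, 124°33′0″E) in decimal degrees
aaq = (45 + 7 / 3600, 37 + 20 / 60 + 50 / 3600)
ysq = (44 + 56 / 60 + 17 / 3600, 124 + 33 / 60)
print(round(haversine_km(*aaq, *ysq), 1))  # kilometres; the page quotes 6494.411
```

The ~18 km gap between the two results reflects the spherical approximation: the haversine answer depends on the chosen Earth radius, while Vincenty accounts for the ellipsoidal flattening.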

How long does it take to fly from Anapa to Qian Gorlos Mongol Autonomous County?

The estimated flight time from Anapa Airport to Songyuan Chaganhu Airport is 8 hours and 9 minutes.
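The page does not state its assumptions, but the figure is consistent with dividing the great-circle distance by an average speed of roughly 800 km/h. A hypothetical sketch (the 800 km/h cruise speed is an assumption, not the site's published method):

```python
# Hypothetical flight-time estimate: distance divided by an assumed
# average speed of 800 km/h (not the site's documented method).
distance_km = 6512.451   # Vincenty distance quoted above
cruise_kmh = 800         # assumed average block speed
total_min = round(distance_km / cruise_kmh * 60)
print(f"{total_min // 60} hours and {total_min % 60} minutes")
```

This lands within a minute or so of the 8 hours and 9 minutes quoted; the small difference comes down to rounding and the exact speed assumed.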

Flight carbon footprint between Anapa Airport (AAQ) and Songyuan Chaganhu Airport (YSQ)

On average, flying from Anapa to Qian Gorlos Mongol Autonomous County generates about 462 kg of CO2 per passenger; 462 kilograms equals 1,019 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path from Anapa to Qian Gorlos Mongol Autonomous County

See the map of the shortest flight path between Anapa Airport (AAQ) and Songyuan Chaganhu Airport (YSQ).

Airport information

Origin Anapa Airport
City: Anapa
Country: Russia
IATA Code: AAQ
ICAO Code: URKA
Coordinates: 45°0′7″N, 37°20′50″E
Destination Songyuan Chaganhu Airport
City: Qian Gorlos Mongol Autonomous County
Country: China
IATA Code: YSQ
ICAO Code: ZYSQ
Coordinates: 44°56′17″N, 124°33′0″E