How far is Baoshan from Camaguey?

The distance between Camaguey (Ignacio Agramonte International Airport) and Baoshan (Baoshan Yunrui Airport) is 9229 miles / 14852 kilometers / 8020 nautical miles.

Ignacio Agramonte International Airport – Baoshan Yunrui Airport

Distance: 9229 miles / 14852 kilometers / 8020 nautical miles
Flight time: 17 h 58 min
CO2 emission: 1 182 kg

Distance from Camaguey to Baoshan

There are several ways to calculate the distance from Camaguey to Baoshan. Here are two standard methods:

Vincenty's formula (applied above)
  • 9228.832 miles
  • 14852.365 kilometers
  • 8019.636 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
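
An ellipsoidal distance of this kind can be computed with the geopy library (pip install geopy); a minimal sketch, assuming the decimal-degree equivalents of the airport coordinates listed below. geopy's geodesic distance uses an ellipsoidal WGS-84 model (Karney's algorithm), which agrees with Vincenty's result to within a few meters:

    # Ellipsoidal (WGS-84) distance via geopy; matches the Vincenty
    # figures above to within rounding.
    from geopy.distance import geodesic

    cmw = (21.420278, -77.847500)  # Ignacio Agramonte International Airport
    bsd = (25.053056, 99.168056)   # Baoshan Yunrui Airport

    d = geodesic(cmw, bsd)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nm:.3f} nm")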

Haversine formula
  • 9219.556 miles
  • 14837.437 kilometers
  • 8011.575 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
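
The haversine formula is compact enough to write out directly; a minimal sketch using a mean Earth radius of 6371 km (the exact radius the site uses is an assumption):

    # Great-circle (haversine) distance on a sphere of radius 6371 km;
    # plugging in the airport coordinates approximately reproduces the
    # figures above.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlmb = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
        return 2 * r * asin(sqrt(a))

    km = haversine_km(21.420278, -77.847500, 25.053056, 99.168056)
    print(f"{km:.3f} km = {km / 1.609344:.3f} mi = {km / 1.852:.3f} nm")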

How long does it take to fly from Camaguey to Baoshan?

The estimated flight time from Ignacio Agramonte International Airport to Baoshan Yunrui Airport is 17 hours and 58 minutes.
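
This estimate is consistent with a simple rule of thumb: a fixed allowance for takeoff and landing plus cruise time at a typical jet speed. A minimal sketch follows; the 850 km/h cruise speed and 30-minute allowance are assumptions that happen to reproduce the figure above, not the site's published method:

    # Rough flight-time estimate: 30 min overhead plus cruise at 850 km/h
    # (both assumed values); yields 17 h 58 min for 14852 km.
    def flight_time_minutes(distance_km, cruise_kmh=850.0, overhead_min=30.0):
        return overhead_min + distance_km / cruise_kmh * 60.0

    total = flight_time_minutes(14852)
    print(f"{int(total // 60)} h {round(total % 60)} min")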

Flight carbon footprint between Ignacio Agramonte International Airport (CMW) and Baoshan Yunrui Airport (BSD)

On average, flying from Camaguey to Baoshan generates about 1 182 kg (2 606 lbs) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
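
The implied emission factor works out to roughly 0.08 kg of CO2 per passenger-kilometer. A minimal sketch; the factor is back-calculated from the figures above and is an assumption, not a published methodology:

    # Per-passenger CO2 estimate; the factor below is derived from the
    # page's own numbers (1 182 kg over 14852 km) and is an assumption.
    KG_PER_PAX_KM = 1182 / 14852   # about 0.0796
    KG_TO_LBS = 2.20462

    co2_kg = 14852 * KG_PER_PAX_KM
    print(f"{co2_kg:.0f} kg = {co2_kg * KG_TO_LBS:.0f} lbs")  # 1182 kg = 2606 lbs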

Map of flight path from Camaguey to Baoshan

See the map of the shortest flight path between Ignacio Agramonte International Airport (CMW) and Baoshan Yunrui Airport (BSD).

Airport information

Origin: Ignacio Agramonte International Airport
City: Camaguey
Country: Cuba
IATA Code: CMW
ICAO Code: MUCM
Coordinates: 21°25′13″N, 77°50′51″W

Destination: Baoshan Yunrui Airport
City: Baoshan
Country: China
IATA Code: BSD
ICAO Code: ZPBS
Coordinates: 25°3′11″N, 99°10′5″E
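
For reference, the coordinates above convert from degrees/minutes/seconds to the decimal degrees used by the distance formulas; a small sketch (the helper is illustrative, not part of the site):

    # Convert degrees/minutes/seconds to signed decimal degrees.
    def dms_to_decimal(deg, minutes, seconds, hemisphere):
        sign = -1.0 if hemisphere in ("S", "W") else 1.0
        return sign * (deg + minutes / 60 + seconds / 3600)

    cmw = (dms_to_decimal(21, 25, 13, "N"), dms_to_decimal(77, 50, 51, "W"))
    bsd = (dms_to_decimal(25, 3, 11, "N"), dms_to_decimal(99, 10, 5, "E"))
    print(cmw)  # ≈ (21.420278, -77.8475)
    print(bsd)  # ≈ (25.053056, 99.168056)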