
How far is Santa Maria Island from Guangzhou?

The distance between Guangzhou (Guangzhou Baiyun International Airport) and Santa Maria Island (Santa Maria Airport) is 7479 miles / 12037 kilometers / 6499 nautical miles.

Guangzhou Baiyun International Airport – Santa Maria Airport

  • 7479 miles
  • 12037 kilometers
  • 6499 nautical miles
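The three figures above are the same distance expressed in different units. Using the standard conversion factors (1 mile = 1.609344 km exactly, 1 nautical mile = 1.852 km exactly), each can be recovered from the kilometer value:

```python
km = 12036.747           # distance in kilometers (from above)
miles = km / 1.609344    # international mile, defined as exactly 1.609344 km
nmi = km / 1.852         # nautical mile, defined as exactly 1.852 km

print(round(miles), round(nmi))  # → 7479 6499
```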


Distance from Guangzhou to Santa Maria Island

There are several ways to calculate the distance from Guangzhou to Santa Maria Island. Here are two standard methods:

Vincenty's formula (applied above)
  • 7479.288 miles
  • 12036.747 kilometers
  • 6499.324 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
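As a concrete illustration, here is a self-contained sketch of Vincenty's inverse method using WGS-84 ellipsoid parameters (the page does not state which ellipsoid it uses, so WGS-84 is an assumption; the airport coordinates are taken from the table at the bottom of this page):

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Geodesic distance in meters between two points on the WGS-84 ellipsoid."""
    a = 6378137.0                 # semi-major axis (m)
    f = 1 / 298.257223563         # flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0            # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# CAN (23°23′32″N, 113°17′56″E) → SMA (36°58′17″N, 25°10′14″W)
d = vincenty_inverse(23.392222, 113.298889, 36.971389, -25.170556)
print(f"{d / 1000:.3f} km")
```

The iteration can fail to converge for nearly antipodal points; production libraries such as GeographicLib handle those cases more robustly.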

Haversine formula
  • 7466.807 miles
  • 12016.662 kilometers
  • 6488.478 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
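The haversine formula is much simpler, since a sphere needs no iteration. A minimal sketch, assuming the IUGG mean earth radius of 6371.0088 km (the page does not state which radius it uses):

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6_371_008.8):
    """Great-circle distance in meters between two points on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# CAN → SMA, same coordinates as above
d = haversine(23.392222, 113.298889, 36.971389, -25.170556)
print(f"{d / 1000:.3f} km")
```

The spherical result differs from the ellipsoidal one by roughly 20 km over this route, which matches the gap between the two figures quoted above.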

How long does it take to fly from Guangzhou to Santa Maria Island?

The estimated flight time from Guangzhou Baiyun International Airport to Santa Maria Airport is 14 hours and 39 minutes.
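The page does not state how it derives this estimate. A common rule of thumb is distance divided by an assumed average speed, plus a fixed overhead for taxi, climb, and descent; the sketch below uses hypothetical parameters (500 mph, 30 minutes) and therefore does not reproduce the 14 h 39 min figure exactly:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead plus distance at an average speed."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    h, m = divmod(round(total_min), 60)
    return f"{h} hours and {m} minutes"

print(flight_time(7479))  # → 15 hours and 27 minutes
```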

Flight carbon footprint between Guangzhou Baiyun International Airport (CAN) and Santa Maria Airport (SMA)

On average, flying from Guangzhou to Santa Maria Island generates about 923 kg of CO2 per passenger, equivalent to 2,035 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
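The pound figure follows from the standard conversion factor of about 2.20462 lb per kilogram:

```python
KG_TO_LB = 2.2046226218  # pounds per kilogram (1 lb = 0.45359237 kg exactly)

def kg_to_lb(kg):
    return kg * KG_TO_LB

print(round(kg_to_lb(923)))  # → 2035
```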

Map of flight path from Guangzhou to Santa Maria Island

See the map of the shortest flight path between Guangzhou Baiyun International Airport (CAN) and Santa Maria Airport (SMA).

Airport information

Origin Guangzhou Baiyun International Airport
City: Guangzhou
Country: China
IATA Code: CAN
ICAO Code: ZGGG
Coordinates: 23°23′32″N, 113°17′56″E
Destination Santa Maria Airport
City: Santa Maria Island
Country: Portugal
IATA Code: SMA
ICAO Code: LPAZ
Coordinates: 36°58′17″N, 25°10′14″W
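The coordinates above are given in degrees, minutes, and seconds, while the distance formulas work in signed decimal degrees. A small helper sketch for the conversion (south and west hemispheres become negative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# CAN: 23°23′32″N, 113°17′56″E
print(dms_to_decimal(23, 23, 32, "N"), dms_to_decimal(113, 17, 56, "E"))
# SMA: 36°58′17″N, 25°10′14″W
print(dms_to_decimal(36, 58, 17, "N"), dms_to_decimal(25, 10, 14, "W"))
```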