
How far is Baise from San Luis Potosi?

The distance between San Luis Potosi (San Luis Potosí International Airport) and Baise (Baise Bama Airport) is 8760 miles / 14097 kilometers / 7612 nautical miles.

San Luis Potosí International Airport – Baise Bama Airport

Distance: 8760 miles / 14097 kilometers / 7612 nautical miles
Flight time: 17 h 5 min
CO2 emission: 1 111 kg


Distance from San Luis Potosi to Baise

There are several ways to calculate the distance from San Luis Potosi to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 8759.576 miles
  • 14097.171 kilometers
  • 7611.863 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
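For readers who want to reproduce the figure above, here is a minimal Python sketch of the standard Vincenty inverse formula on the WGS-84 ellipsoid. The ellipsoid constants and convergence tolerance below are assumptions (the site does not publish its exact parameters), so the result may differ from the value above in the last digits.

import math

def vincenty_km(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid constants (assumed; the site's exact model is not stated)
    a = 6378137.0              # semi-major axis in metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda until it converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        if cos2Alpha != 0:
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        else:
            cos2SigmaM = 0.0  # both points on the equator
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (
        cos2SigmaM + B / 4 * (
            cosSigma * (-1 + 2 * cos2SigmaM ** 2)
            - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
            * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0  # metres -> kilometres

# Airport coordinates from this page, converted to decimal degrees
slp = (22.2542, -100.9308)   # SLP (west longitude is negative)
aeb = (23.7206, 106.9597)    # AEB

km = vincenty_km(*slp, *aeb)
print(km, km / 1.609344)     # roughly 14 097 km / 8 760 miles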

Haversine formula
  • 8748.819 miles
  • 14079.859 kilometers
  • 7602.516 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
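The haversine result above can be checked with a few lines of Python. The mean Earth radius of 6 371 km used below is an assumption; the site does not state which radius it uses, so the last digits may differ slightly.

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given mean radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# Same airport coordinates as above (decimal degrees, west longitude negative)
km = haversine_km(22.2542, -100.9308, 23.7206, 106.9597)
print(km, km / 1.609344)   # roughly 14 080 km / 8 749 miles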

How long does it take to fly from San Luis Potosi to Baise?

The estimated flight time from San Luis Potosí International Airport to Baise Bama Airport is 17 hours and 5 minutes.
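The page does not say how this estimate is derived. A common rough approach, sketched below, divides the flight distance by a typical cruise speed and adds a fixed allowance for taxi, takeoff, climb and descent; the 850 km/h cruise speed and 30-minute allowance are assumptions that happen to reproduce a figure close to the one above.

def block_time(distance_km, cruise_kmh=850, overhead_min=30):
    # Rough block-time estimate: cruise time plus a fixed overhead.
    # cruise_kmh and overhead_min are assumptions, not the site's method.
    total_min = round(distance_km / cruise_kmh * 60 + overhead_min)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} h {minutes} min"

print(block_time(14097))   # "17 h 5 min"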

Flight carbon footprint between San Luis Potosí International Airport (SLP) and Baise Bama Airport (AEB)

On average, flying from San Luis Potosi to Baise generates about 1 111 kg of CO2 per passenger; 1 111 kilograms is equal to 2 449 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
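The pound figure is simply a unit conversion of the kilogram estimate, which can be checked directly:

KG_TO_LB = 2.20462                 # pounds per kilogram

co2_kg = 1111                      # per-passenger estimate from this page
print(round(co2_kg * KG_TO_LB))    # approximately 2449 lb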

Map of flight path from San Luis Potosi to Baise

See the map of the shortest flight path between San Luis Potosí International Airport (SLP) and Baise Bama Airport (AEB).

Airport information

Origin: San Luis Potosí International Airport
City: San Luis Potosi
Country: Mexico
IATA Code: SLP
ICAO Code: MMSP
Coordinates: 22°15′15″N, 100°55′51″W
Destination: Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E