How far is Baise from Sehwan Sharif?

The distance between Sehwan Sharif (Sehwan Sharif Airport) and Baise (Baise Bama Airport) is 2458 miles / 3955 kilometers / 2136 nautical miles.

The driving distance from Sehwan Sharif (SYW) to Baise (AEB) is 3467 miles / 5580 kilometers, and travel time by car is about 67 hours 41 minutes.

Sehwan Sharif Airport – Baise Bama Airport

2458 miles / 3955 kilometers / 2136 nautical miles

Distance from Sehwan Sharif to Baise

There are several ways to calculate the distance from Sehwan Sharif to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 2457.599 miles
  • 3955.122 kilometers
  • 2135.595 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
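
The site's exact implementation isn't published; as a reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The decimal coordinates are converted from the DMS values in the airport information section below, and the iteration cap and convergence threshold are typical choices rather than confirmed parameters.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Distance in miles between two points on the WGS-84 ellipsoid,
    via Vincenty's inverse formula (iterative)."""
    a = 6378137.0             # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis (meters)

    L = math.radians(lon2 - lon1)
    # Reduced latitudes
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the auxiliary longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2*sigma_m); zero for points on the equatorial line
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
        * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344  # meters -> statute miles

# SYW (26°28′23″N, 67°43′1″E) to AEB (23°43′14″N, 106°57′35″E)
print(round(vincenty_miles(26.4731, 67.7169, 23.7206, 106.9597), 1))  # ≈ 2457.6
```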

Haversine formula
  • 2453.456 miles
  • 3948.455 kilometers
  • 2131.995 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
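
The haversine version is much shorter. This sketch assumes the conventional mean Earth radius of 6371 km, with the same decimal coordinates as above:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in miles, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a)) / 1.609344  # km -> miles

print(round(haversine_miles(26.4731, 67.7169, 23.7206, 106.9597), 1))  # ≈ 2453.5
```

The spherical model trades a few miles of accuracy (here, about 4 miles versus Vincenty) for a formula with no iteration.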

How long does it take to fly from Sehwan Sharif to Baise?

The estimated flight time from Sehwan Sharif Airport to Baise Bama Airport is 5 hours and 9 minutes.
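
The page doesn't state how this estimate is derived. A common back-of-envelope approach divides the great-circle distance by an assumed average block speed; in the sketch below, the 475 mph figure is an assumption chosen to land near the estimate above, not a published parameter.

```python
def estimate_flight_hours(distance_miles, block_speed_mph=475.0):
    """Rough flight-time estimate: distance over an assumed average
    block speed (cruise plus climb and descent). The speed is an assumption."""
    return distance_miles / block_speed_mph

hours = estimate_flight_hours(2457.599)
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"{h} h {m} min")  # ≈ 5 h 10 min, close to the 5 h 9 min above
```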

Flight carbon footprint between Sehwan Sharif Airport (SYW) and Baise Bama Airport (AEB)

On average, flying from Sehwan Sharif to Baise generates about 270 kg of CO2 per passenger, which is roughly 596 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
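
Dividing 270 kg by the 2458-mile distance implies an emission factor of roughly 0.11 kg of CO2 per passenger-mile. The sketch below reproduces the figures from that inferred factor; it is an approximation derived from the numbers above, not a published parameter.

```python
CO2_KG_PER_PASSENGER_MILE = 0.11  # inferred from 270 kg / 2458 mi; an assumption
KG_TO_LBS = 2.20462               # kilograms to pounds

co2_kg = CO2_KG_PER_PASSENGER_MILE * 2457.599
print(f"{co2_kg:.0f} kg ≈ {co2_kg * KG_TO_LBS:.0f} lbs")  # 270 kg ≈ 596 lbs
```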

Map of flight path and driving directions from Sehwan Sharif to Baise

See the map of the shortest flight path between Sehwan Sharif Airport (SYW) and Baise Bama Airport (AEB).

Airport information

Origin: Sehwan Sharif Airport
City: Sehwan Sharif
Country: Pakistan
IATA Code: SYW
ICAO Code: OPSN
Coordinates: 26°28′23″N, 67°43′1″E
Destination: Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E
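
The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper covers the conversion; the hemisphere handling follows the standard sign convention (south and west are negative).

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W)
    to signed decimal degrees."""
    sign = -1.0 if hemisphere in ("S", "W") else 1.0
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SYW: 26°28′23″N, 67°43′1″E  ->  (26.4731, 67.7169)
print(dms_to_decimal(26, 28, 23, "N"), dms_to_decimal(67, 43, 1, "E"))
# AEB: 23°43′14″N, 106°57′35″E  ->  (23.7206, 106.9597)
print(dms_to_decimal(23, 43, 14, "N"), dms_to_decimal(106, 57, 35, "E"))
```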