
How far is Baise from Houston, TX?

The distance between Houston (Houston George Bush Intercontinental Airport) and Baise (Baise Bama Airport) is 8454 miles / 13605 kilometers / 7346 nautical miles.

Houston George Bush Intercontinental Airport – Baise Bama Airport

Distance: 8454 miles / 13605 kilometers / 7346 nautical miles
Flight time: 16 h 30 min
CO2 emission: 1 065 kg


Distance from Houston to Baise

There are several ways to calculate the distance from Houston to Baise. Here are two standard methods:

Vincenty's formula (applied above)
  • 8453.739 miles
  • 13604.974 kilometers
  • 7346.098 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
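For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is an independent illustration, not the calculator's published code; the iteration limit, convergence tolerance, and the decimal coordinates for IAH and AEB (converted from the DMS values listed under "Airport information" below) are assumptions.

import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Ellipsoidal (WGS-84) distance in metres via Vincenty's inverse formula."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):       # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma * (
            cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - d_sigma)

# IAH and AEB in decimal degrees (converted from the DMS coordinates listed below)
iah = (29.98417, -95.34139)
aeb = (23.72056, 106.95972)
print(vincenty_distance_m(*iah, *aeb) / 1000)   # should come out close to the 13,605 km quoted above

Note that Vincenty's iteration can fail to converge for nearly antipodal points, which is one reason some tools prefer newer geodesic algorithms.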

Haversine formula
  • 8442.054 miles
  • 13586.168 kilometers
  • 7335.944 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
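A haversine sketch in the same spirit, assuming a mean Earth radius of 6 371 km (the exact result shifts slightly with the radius chosen):

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_km(29.98417, -95.34139, 23.72056, 106.95972))   # should land near the 13,586 km quoted above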

How long does it take to fly from Houston to Baise?

The estimated flight time from Houston George Bush Intercontinental Airport to Baise Bama Airport is 16 hours and 30 minutes.
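The page does not publish its flight-time formula, so the sketch below is only an assumption: the great-circle distance flown at a typical long-haul cruise speed plus a fixed allowance for taxi, climb, and descent. The cruise speed and overhead used here are made-up parameters that land close to, but not exactly on, the 16 h 30 min above.

def estimated_flight_time_h(distance_miles, cruise_mph=530.0, overhead_h=0.5):
    """Rough flight time: cruise segment plus a fixed overhead.
    Both parameters are assumptions, not the calculator's published inputs."""
    return distance_miles / cruise_mph + overhead_h

hours = estimated_flight_time_h(8454)
print(f"{int(hours)} h {round((hours % 1) * 60)} min")   # about 16 h 27 min with these assumptions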

Flight carbon footprint between Houston George Bush Intercontinental Airport (IAH) and Baise Bama Airport (AEB)

On average, flying from Houston to Baise generates about 1 065 kg of CO2 per passenger, which is equivalent to 2 349 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
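The kilogram-to-pound conversion is simple arithmetic, shown below for the rounded per-passenger figure; the slightly higher 2 349 lb quoted above presumably comes from the unrounded kilogram value.

KG_TO_LB = 2.2046226218   # pounds per kilogram

co2_kg = 1065
print(round(co2_kg * KG_TO_LB))   # about 2 348 lb for the rounded 1 065 kg figure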

Map of flight path from Houston to Baise

See the map of the shortest flight path between Houston George Bush Intercontinental Airport (IAH) and Baise Bama Airport (AEB).

Airport information

Origin Houston George Bush Intercontinental Airport
City: Houston, TX
Country: United States
IATA Code: IAH
ICAO Code: KIAH
Coordinates: 29°59′3″N, 95°20′29″W
Destination Baise Bama Airport
City: Baise
Country: China
IATA Code: AEB
ICAO Code: ZGBS
Coordinates: 23°43′14″N, 106°57′35″E
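The coordinates above are given in degrees, minutes, and seconds. A small helper (an illustration, not part of the page) converts them to the signed decimal degrees used by the distance formulas earlier:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(29, 59, 3, "N"), dms_to_decimal(95, 20, 29, "W"))    # IAH: about 29.98417, -95.34139
print(dms_to_decimal(23, 43, 14, "N"), dms_to_decimal(106, 57, 35, "E"))  # AEB: about 23.72056, 106.95972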