
How far is Benbecula from Houston, TX?

The distance between Houston (Houston George Bush Intercontinental Airport) and Benbecula (Benbecula Airport) is 4433 miles / 7135 kilometers / 3852 nautical miles.

Houston George Bush Intercontinental Airport – Benbecula Airport

4433 miles / 7135 kilometers / 3852 nautical miles


Distance from Houston to Benbecula

There are several ways to calculate the distance from Houston to Benbecula. Here are two standard methods:

Vincenty's formula (applied above)
  • 4433.228 miles
  • 7134.589 kilometers
  • 3852.370 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
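For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration limit, and tolerance are illustrative choices rather than this site's actual code, so the last decimals may differ slightly.

```python
import math

def vincenty_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in metres on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - d_sigma)

# IAH and BEB coordinates (decimal degrees, from the airport information section)
mi = vincenty_m(29.98417, -95.34139, 57.48083, -7.36278) / 1609.344
print(f"{mi:.1f} miles")  # should land very close to the 4433.2-mile figure above
```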

Haversine formula
  • 4424.353 miles
  • 7120.306 kilometers
  • 3844.657 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
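A haversine implementation is only a few lines. The mean Earth radius of 6371 km used below is a common convention, not necessarily the value this site uses, which accounts for small differences in the last digits.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# IAH and BEB coordinates (decimal degrees, from the airport information section)
d_km = haversine_km(29.98417, -95.34139, 57.48083, -7.36278)
print(f"{d_km:.0f} km, {d_km / 1.609344:.0f} miles")  # close to the 7120 km / 4424 mi figures above
```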

How long does it take to fly from Houston to Benbecula?

The estimated flight time from Houston George Bush Intercontinental Airport to Benbecula Airport is 8 hours and 53 minutes.
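The site does not publish its exact method, but a common back-of-the-envelope estimate divides the great-circle distance by a typical cruise speed of around 500 mph, optionally adding a fixed allowance for taxi, climb, and descent. The sketch below uses those assumed numbers; it is not the calculator's actual formula.

```python
def estimate_flight_time(distance_miles, cruise_mph=500.0, overhead_minutes=0):
    """Rough flight-time estimate from distance and an assumed cruise speed."""
    total_minutes = distance_miles / cruise_mph * 60 + overhead_minutes
    hours, minutes = divmod(round(total_minutes), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimate_flight_time(4433.228))  # about 8 hours and 52 minutes, close to the quoted 8 h 53 min
```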

Flight carbon footprint between Houston George Bush Intercontinental Airport (IAH) and Benbecula Airport (BEB)

On average, flying from Houston to Benbecula generates about 511 kg of CO2 per passenger; 511 kilograms is equal to about 1,126 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
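For anyone checking these numbers, the sketch below simply converts kilograms to pounds and backs out the per-passenger emissions rate implied by the figures quoted above; it is not the calculator's own emissions model.

```python
KG_PER_LB = 0.45359237    # exact definition of the avoirdupois pound

co2_kg = 511.0            # per-passenger figure quoted above
distance_km = 7134.589    # ellipsoidal distance quoted above

print(f"{co2_kg / KG_PER_LB:.1f} lbs")                     # ~1126.6 lbs
print(f"{co2_kg / distance_km * 1000:.1f} g CO2 per km")   # implied ~71.6 g CO2 per passenger-km
```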

Map of flight path from Houston to Benbecula

See the map of the shortest flight path between Houston George Bush Intercontinental Airport (IAH) and Benbecula Airport (BEB).

Airport information

Origin: Houston George Bush Intercontinental Airport
City: Houston, TX
Country: United States
IATA Code: IAH
ICAO Code: KIAH
Coordinates: 29°59′3″N, 95°20′29″W

Destination: Benbecula Airport
City: Benbecula
Country: United Kingdom
IATA Code: BEB
ICAO Code: EGPL
Coordinates: 57°28′51″N, 7°21′46″W
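The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small helper (the function name is my own) makes the conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Airport coordinates as listed above
iah = (dms_to_decimal(29, 59, 3, "N"), dms_to_decimal(95, 20, 29, "W"))  # ≈ (29.9842, -95.3414)
beb = (dms_to_decimal(57, 28, 51, "N"), dms_to_decimal(7, 21, 46, "W"))  # ≈ (57.4808, -7.3628)
```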