
How far is Petrolina from Houston, TX?

The distance between Houston (Houston George Bush Intercontinental Airport) and Petrolina (Petrolina Airport) is 4535 miles / 7298 kilometers / 3941 nautical miles.

Houston George Bush Intercontinental Airport – Petrolina Airport

  • 4535 miles
  • 7298 kilometers
  • 3941 nautical miles


Distance from Houston to Petrolina

There are several ways to calculate the distance from Houston to Petrolina. Here are two standard methods:

Vincenty's formula (applied above)
  • 4534.683 miles
  • 7297.865 kilometers
  • 3940.532 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
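For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid, using the airport coordinates listed at the bottom of the page converted to decimal degrees. It is illustrative only, not necessarily the code this calculator runs, and its result should agree closely with the ~4535-mile figure above.

import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    # WGS-84 ellipsoid parameters
    a = 6378137.0                  # semi-major axis in metres
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):           # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sigma_m +
            C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
        B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)   # distance in metres

# IAH (29°59′3″N, 95°20′29″W) and PNZ (9°21′44″S, 40°34′8″W) in decimal degrees
metres = vincenty_distance_m(29.9842, -95.3414, -9.3622, -40.5689)
print(metres / 1609.344, "miles")          # should be close to 4535 miles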

Haversine formula
  • 4538.765 miles
  • 7304.435 kilometers
  • 3944.079 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
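As a rough sketch, the haversine distance can be computed as follows, assuming a mean Earth radius of 6371 km (the calculator's exact radius is not stated):

import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere of the given radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

km = haversine_km(29.9842, -95.3414, -9.3622, -40.5689)
print(km, "km")                 # roughly the 7304 km quoted above
print(km / 1.609344, "miles")   # roughly 4539 miles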

How long does it take to fly from Houston to Petrolina?

The estimated flight time from Houston George Bush Intercontinental Airport to Petrolina Airport is 9 hours and 5 minutes.
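The page does not state the exact assumptions behind that estimate. A common back-of-the-envelope approach, shown as a hypothetical sketch below rather than the calculator's actual method, divides the distance by a typical jet cruise speed of about 500 mph:

def estimated_flight_time(distance_miles, cruise_mph=500.0):
    # Hypothetical rule of thumb: block time ≈ distance / average cruise speed
    hours = distance_miles / cruise_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m:02d} min"

print(estimated_flight_time(4535))   # ≈ 9 h 04 min, close to the 9 h 5 min shown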

Flight carbon footprint between Houston George Bush Intercontinental Airport (IAH) and Petrolina Airport (PNZ)

On average, flying from Houston to Petrolina generates about 524 kg of CO2 per passenger; 524 kilograms equals 1,155 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
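For a rough idea of where such a number comes from, the sketch below multiplies the flight distance by a per-passenger emission factor of about 0.115 kg CO2 per mile (an assumption back-calculated from the page's own 524 kg / 4535-mile figures, not an official factor) and converts kilograms to pounds:

KG_PER_LB = 0.45359237   # exact definition of the pound in kilograms

def co2_estimate_kg(distance_miles, kg_per_mile=0.115):
    # Assumed emission factor; real estimates vary with aircraft, load factor, routing
    return distance_miles * kg_per_mile

kg = co2_estimate_kg(4535)
print(round(kg), "kg CO2")               # ≈ 522 kg with this assumed factor
print(round(524 / KG_PER_LB), "lbs")     # 524 kg ≈ 1155 lbs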

Map of flight path from Houston to Petrolina

See the map of the shortest flight path between Houston George Bush Intercontinental Airport (IAH) and Petrolina Airport (PNZ).

Airport information

Origin: Houston George Bush Intercontinental Airport
City: Houston, TX
Country: United States
IATA Code: IAH
ICAO Code: KIAH
Coordinates: 29°59′3″N, 95°20′29″W
Destination: Petrolina Airport
City: Petrolina
Country: Brazil
IATA Code: PNZ
ICAO Code: SBPL
Coordinates: 9°21′44″S, 40°34′8″W
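
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small hypothetical helper for that conversion, matching the values used in the examples above:

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern latitudes and western longitudes become negative decimal degrees
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(29, 59, 3, "N"))    # ≈ 29.9842  (IAH latitude)
print(dms_to_decimal(95, 20, 29, "W"))   # ≈ -95.3414 (IAH longitude)
print(dms_to_decimal(9, 21, 44, "S"))    # ≈ -9.3622  (PNZ latitude)
print(dms_to_decimal(40, 34, 8, "W"))    # ≈ -40.5689 (PNZ longitude)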