
How far is Pathein from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Pathein (Pathein Airport) is 8475 miles / 13639 kilometers / 7365 nautical miles.

Pittsburgh International Airport – Pathein Airport

Distance: 8475 miles / 13639 kilometers / 7365 nautical miles
Flight time: 16 h 32 min
Time difference: 11 h 30 min
CO2 emission: 1 068 kg


Distance from Pittsburgh to Pathein

There are several ways to calculate the distance from Pittsburgh to Pathein. Here are two standard methods:

Vincenty's formula (applied above)
  • 8474.984 miles
  • 13639.164 kilometers
  • 7364.559 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
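As a concrete illustration, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates listed at the end of this page converted to decimal degrees. The ellipsoid constants and convergence tolerance are standard WGS-84 values chosen for this sketch, not parameters published by the calculator.

```python
from math import atan, atan2, cos, radians, sin, sqrt, tan

# WGS-84 ellipsoid parameters (standard values, assumed here).
A = 6378137.0            # semi-major axis in meters
F = 1 / 298.257223563    # flattening
B = (1 - F) * A          # semi-minor axis

def vincenty_meters(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Vincenty's inverse formula: geodesic distance on the WGS-84 ellipsoid."""
    U1 = atan((1 - F) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - F) * tan(radians(lat2)))
    L = radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)
    lam = L
    for _ in range(iterations):
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = F / 16 * cos2_alpha * (4 + F * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * F * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m +
                                     C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    # Vincenty's series for the geodesic length.
    u2 = cos2_alpha * (A ** 2 - B ** 2) / B ** 2
    k1 = (sqrt(1 + u2) - 1) / (sqrt(1 + u2) + 1)
    A_ = (1 + k1 ** 2 / 4) / (1 - k1)
    B_ = k1 * (1 - 3 * k1 ** 2 / 8)
    delta_sigma = B_ * sin_sigma * (cos_2sigma_m + B_ / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B_ / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return B * A_ * (sigma - delta_sigma)

# PIT and BSX coordinates in decimal degrees (from the airport information below).
d = vincenty_meters(40.4914, -80.2328, 16.8150, 94.7797)
print(f"{d / 1609.344:.3f} miles")  # ~8475 miles, matching the figure above
```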

Haversine formula
  • 8464.349 miles
  • 13622.049 kilometers
  • 7355.318 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
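For comparison, a haversine sketch in Python. The mean Earth radius of 3958.8 miles is an assumption for this sketch; the exact result shifts slightly with the radius chosen.

```python
from math import asin, cos, radians, sin, sqrt

# PIT and BSX coordinates in decimal degrees (from the airport information below).
PIT = (40.4914, -80.2328)
BSX = (16.8150, 94.7797)

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance on a sphere with the given mean Earth radius."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_miles * asin(sqrt(a))

print(f"{haversine_miles(*PIT, *BSX):.3f} miles")  # ~8464 miles
```

The spherical result comes out about 10 miles shorter than Vincenty's, which is typical for long routes: the sphere slightly misstates distances that the ellipsoidal model captures.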

How long does it take to fly from Pittsburgh to Pathein?

The estimated flight time from Pittsburgh International Airport to Pathein Airport is 16 hours and 32 minutes.
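The calculator does not publish its timing model. One common rule of thumb, cruising near 850 km/h plus roughly 30 minutes of taxi, climb, and descent overhead, lands within a minute of the figure above; the sketch below uses those assumed parameters.

```python
# Hypothetical model: cruise speed and overhead are assumptions, not the
# calculator's documented formula.
def estimated_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    total_min = distance_km / cruise_kmh * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(13639))  # 16 h 33 min, within a minute of the published value
```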

Flight carbon footprint between Pittsburgh International Airport (PIT) and Pathein Airport (BSX)

On average, flying from Pittsburgh to Pathein generates about 1 068 kg of CO2 per passenger, which is roughly 2 356 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
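The unit conversion and the implied per-kilometer emission factor can be checked directly. In the sketch below, the ~0.078 kg/km factor is inferred from the published totals rather than a documented constant, and the site's 2 356 lbs likely reflects an unrounded kilogram value.

```python
KG_PER_LB = 0.45359237  # exact definition of the international pound

co2_kg = 1068
print(f"{co2_kg / KG_PER_LB:.0f} lbs")        # 2355 lbs (site shows 2 356, likely from unrounded kg)
print(f"{co2_kg / 13639:.4f} kg CO2 per km")  # ~0.0783, inferred emission factor
```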

Map of flight path from Pittsburgh to Pathein

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Pathein Airport (BSX).
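The plotted path follows the great circle between the two airports. A minimal sketch of spherical interpolation that generates such waypoints, again using the coordinates from the airport information below:

```python
from math import asin, atan2, cos, degrees, radians, sin, sqrt

def great_circle_waypoints(lat1, lon1, lat2, lon2, n=8):
    """Sample n+1 points along the great circle via spherical interpolation."""
    phi1, lmb1 = radians(lat1), radians(lon1)
    phi2, lmb2 = radians(lat2), radians(lon2)
    # Angular distance between the endpoints (haversine form).
    d = 2 * asin(sqrt(sin((phi2 - phi1) / 2) ** 2 +
                      cos(phi1) * cos(phi2) * sin((lmb2 - lmb1) / 2) ** 2))
    points = []
    for i in range(n + 1):
        f = i / n
        a = sin((1 - f) * d) / sin(d)
        b = sin(f * d) / sin(d)
        x = a * cos(phi1) * cos(lmb1) + b * cos(phi2) * cos(lmb2)
        y = a * cos(phi1) * sin(lmb1) + b * cos(phi2) * sin(lmb2)
        z = a * sin(phi1) + b * sin(phi2)
        points.append((degrees(atan2(z, sqrt(x * x + y * y))),
                       degrees(atan2(y, x))))
    return points

for lat, lon in great_circle_waypoints(40.4914, -80.2328, 16.8150, 94.7797):
    print(f"{lat:8.3f}, {lon:9.3f}")
```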

Airport information

Origin: Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W
Destination: Pathein Airport
City: Pathein
Country: Burma
IATA Code: BSX
ICAO Code: VYPN
Coordinates: 16°48′54″N, 94°46′47″E
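The coordinates above use degrees-minutes-seconds notation. A small sketch converting them to the decimal degrees used in the distance examples; the regex pattern is an assumption about this page's formatting.

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like 40°29′29″N to signed decimal degrees."""
    deg, minutes, seconds, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south and west are negative

print(dms_to_decimal("40°29′29″N"), dms_to_decimal("80°13′58″W"))  # 40.4914 -80.2328
print(dms_to_decimal("16°48′54″N"), dms_to_decimal("94°46′47″E"))  # 16.8150  94.7797
```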