
How far is Belgaum from Pittsburgh, PA?

The distance between Pittsburgh (Pittsburgh International Airport) and Belgaum (Belgaum Airport) is 8233 miles / 13249 kilometers / 7154 nautical miles.

Pittsburgh International Airport – Belgaum Airport

Distance: 8233 miles / 13249 kilometers / 7154 nautical miles
Flight time: 16 h 5 min
Time difference: 10 h 30 min
CO2 emission: 1 033 kg


Distance from Pittsburgh to Belgaum

There are several ways to calculate the distance from Pittsburgh to Belgaum. Here are two standard methods:

Vincenty's formula (applied above)
  • 8232.601 miles
  • 13249.087 kilometers
  • 7153.935 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
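As an illustration only (not the calculator's own code), the sketch below implements the standard Vincenty inverse formula in Python on the WGS-84 ellipsoid, using the airport coordinates listed further down converted to decimal degrees. The ellipsoid choice and convergence settings are assumptions.

```python
import math

# WGS-84 ellipsoid parameters (assumed; the page does not state which ellipsoid it uses)
WGS84_A = 6378137.0              # semi-major axis in meters
WGS84_F = 1 / 298.257223563      # flattening
WGS84_B = (1 - WGS84_F) * WGS84_A

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Vincenty inverse formula: ellipsoidal distance in statute miles.

    Returns None if the iteration fails to converge (nearly antipodal points).
    """
    a, b, f = WGS84_A, WGS84_B, WGS84_F
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                      # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    else:
        return None                         # did not converge

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344                # meters to statute miles

# PIT (40°29′29″N, 80°13′58″W) to IXG (15°51′33″N, 74°37′5″E), in decimal degrees
print(vincenty_miles(40.491389, -80.232778, 15.859167, 74.618056))  # ≈ 8233 miles
```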

Haversine formula
  • 8221.895 miles
  • 13231.857 kilometers
  • 7144.631 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
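For comparison, a minimal haversine sketch in Python is shown below. The mean Earth radius used here is an assumption; the page does not state its exact constant, so the result only approximates the figure above.

```python
import math

MEAN_EARTH_RADIUS_KM = 6371.0  # assumed mean radius of the Earth

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * MEAN_EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# PIT to IXG, coordinates in decimal degrees
print(haversine_km(40.491389, -80.232778, 15.859167, 74.618056))  # ≈ 13232 km
```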

How long does it take to fly from Pittsburgh to Belgaum?

The estimated flight time from Pittsburgh International Airport to Belgaum Airport is 16 hours and 5 minutes.
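The page does not spell out how this estimate is derived. A common approach, sketched below, is to divide the distance by an assumed average cruise speed and add a fixed allowance for take-off and landing; both constants here are assumptions, so the result will not exactly match the figure above.

```python
# Illustrative estimate only; the cruise speed and ground-time allowance are assumptions.
ASSUMED_CRUISE_MPH = 500      # typical long-haul airliner average speed (assumption)
ASSUMED_GROUND_HOURS = 0.5    # allowance for taxi, take-off and landing (assumption)

distance_miles = 8233
hours = distance_miles / ASSUMED_CRUISE_MPH + ASSUMED_GROUND_HOURS
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # rough estimate, not the page's 16 h 5 min
```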

Flight carbon footprint between Pittsburgh International Airport (PIT) and Belgaum Airport (IXG)

On average, flying from Pittsburgh to Belgaum generates about 1 033 kg of CO2 per passenger, which is equivalent to roughly 2 276 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
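As a quick check of the unit conversion (using the standard factor of about 2.20462 lb per kg; the slight difference from the rounded figure above presumably comes from converting the unrounded kilogram value):

```python
co2_kg = 1033
print(round(co2_kg * 2.20462))  # ≈ 2277 lb, close to the 2 276 lb quoted above
```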

Map of flight path from Pittsburgh to Belgaum

See the map of the shortest flight path between Pittsburgh International Airport (PIT) and Belgaum Airport (IXG).

Airport information

Origin Pittsburgh International Airport
City: Pittsburgh, PA
Country: United States
IATA Code: PIT
ICAO Code: KPIT
Coordinates: 40°29′29″N, 80°13′58″W
Destination Belgaum Airport
City: Belgaum
Country: India
IATA Code: IXG
ICAO Code: VABM
Coordinates: 15°51′33″N, 74°37′5″E
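The decimal-degree values used in the illustrative snippets earlier on this page can be derived from these coordinates with a small helper like this (an illustrative conversion, not part of the calculator):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(40, 29, 29, "N"), dms_to_decimal(80, 13, 58, "W"))  # PIT ≈ 40.4914, -80.2328
print(dms_to_decimal(15, 51, 33, "N"), dms_to_decimal(74, 37, 5, "E"))   # IXG ≈ 15.8592, 74.6181
```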