
How far is Bandar Mahshahr from Sabetta?

The distance between Sabetta (Sabetta International Airport) and Bandar Mahshahr (Mahshahr Airport) is 2942 miles / 4734 kilometers / 2556 nautical miles.

Sabetta International Airport – Mahshahr Airport

Distance: 2942 miles / 4734 kilometers / 2556 nautical miles
Flight time: 6 h 4 min
Time difference: 1 h 30 min
CO2 emission: 327 kg


Distance from Sabetta to Bandar Mahshahr

There are several ways to calculate the distance from Sabetta to Bandar Mahshahr. Here are two standard methods:

Vincenty's formula (applied above)
  • 2941.590 miles
  • 4734.030 kilometers
  • 2556.171 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
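As a rough illustration, an ellipsoidal distance like this can be reproduced with the geopy library. This is a sketch, not the calculator's own code: geopy's geodesic distance uses Karney's method on the WGS-84 ellipsoid, which agrees with Vincenty's result to well under a mile on a route of this length. The coordinates are converted from the airport information at the bottom of the page.

    # Ellipsoidal distance on the WGS-84 model (a sketch; geopy's
    # geodesic uses Karney's method, which closely matches Vincenty).
    from geopy.distance import geodesic

    sabetta = (71.219167, 72.051944)    # SBT: 71°13′9″N, 72°3′7″E
    mahshahr = (30.556111, 49.151667)   # MRX: 30°33′22″N, 49°9′6″E

    d = geodesic(sabetta, mahshahr)
    print(f"{d.miles:.3f} mi / {d.km:.3f} km / {d.nautical:.3f} nm")
    # Expected output close to: 2941.590 mi / 4734.030 km / 2556.171 nm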

Haversine formula
  • 2939.826 miles
  • 4731.191 kilometers
  • 2554.639 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
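The haversine calculation is compact enough to sketch in full. This version assumes a mean Earth radius of 6371 km; a different choice of radius shifts the result slightly, so it will not match the figures above to the last digit.

    # Great-circle (haversine) distance on a spherical Earth.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance between two lat/lon points, in km."""
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * radius_km * asin(sqrt(a))

    km = haversine_km(71.219167, 72.051944, 30.556111, 49.151667)
    print(f"{km:.3f} km / {km / 1.609344:.3f} mi / {km / 1.852:.3f} nm")
    # Roughly 4731 km / 2940 mi / 2555 nm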

How long does it take to fly from Sabetta to Bandar Mahshahr?

The estimated flight time from Sabetta International Airport to Mahshahr Airport is 6 hours and 4 minutes.
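The site does not publish its flight-time formula. As a hypothetical back-of-the-envelope check, 6 hours 4 minutes corresponds to covering 2942 miles at an average block speed of roughly 485 mph; that speed is an assumption inferred from the quoted numbers, not a documented parameter.

    # Hypothetical flight-time estimate: distance / average block speed.
    # The 485 mph figure is back-calculated from the quoted time, not a
    # documented parameter of the calculator.
    distance_miles = 2942
    avg_speed_mph = 485  # assumed gate-to-gate average speed

    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    print(f"{h} h {m} min")  # -> 6 h 4 min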

Flight carbon footprint between Sabetta International Airport (SBT) and Mahshahr Airport (MRX)

On average, flying from Sabetta to Bandar Mahshahr generates about 327 kg (721 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
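The per-passenger figure implies an emission factor of roughly 0.11 kg of CO2 per mile flown; that factor is back-calculated from the numbers above, not a published constant. A quick sketch of the arithmetic, including the kilograms-to-pounds conversion:

    # Back-calculated CO2 arithmetic (the ~0.111 kg/mile factor is an
    # assumption derived from the figures above, not a published constant).
    distance_miles = 2942
    co2_kg = 327

    factor = co2_kg / distance_miles  # ≈ 0.111 kg CO2 per mile
    co2_lbs = co2_kg * 2.20462        # kg -> lbs
    print(f"{factor:.3f} kg/mi, {co2_lbs:.0f} lbs")  # -> 0.111 kg/mi, 721 lbs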

Map of flight path from Sabetta to Bandar Mahshahr

See the map of the shortest flight path between Sabetta International Airport (SBT) and Mahshahr Airport (MRX).

Airport information

Origin: Sabetta International Airport
City: Sabetta
Country: Russia
IATA Code: SBT
ICAO Code: USDA
Coordinates: 71°13′9″N, 72°3′7″E
Destination: Mahshahr Airport
City: Bandar Mahshahr
Country: Iran
IATA Code: MRX
ICAO Code: OIAM
Coordinates: 30°33′22″N, 49°9′6″E
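The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small conversion helper (the function name is illustrative):

    # Convert degrees/minutes/seconds to the decimal degrees used above.
    def dms_to_decimal(deg, minutes, seconds):
        return deg + minutes / 60 + seconds / 3600

    print(dms_to_decimal(71, 13, 9))   # SBT latitude  -> 71.219166...
    print(dms_to_decimal(30, 33, 22))  # MRX latitude  -> 30.556111...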