How far is Sanandaj from Bandar Mahshahr?
The flight distance between Bandar Mahshahr (Mahshahr Airport) and Sanandaj (Sanandaj Airport) is 346 miles / 557 kilometers / 301 nautical miles.
The driving distance from Bandar Mahshahr (MRX) to Sanandaj (SDG) is 444 miles / 714 kilometers, and travel time by car is about 9 hours 12 minutes.
Mahshahr Airport – Sanandaj Airport
Distance from Bandar Mahshahr to Sanandaj
There are several ways to calculate the distance from Bandar Mahshahr to Sanandaj. Here are two standard methods:
Vincenty's formula (applied above)
- 346.322 miles
- 557.351 kilometers
- 300.945 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
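As a sketch, Vincenty's inverse method can be implemented with the standard iterative formulation. The WGS-84 ellipsoid parameters are an assumption here, since the page does not name the datum it uses:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Geodesic distance in km between two lat/lon points (degrees),
    using Vincenty's iterative inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0                 # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):          # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0            # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha != 0 else 0.0)   # equatorial line
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0    # metres -> km

# MRX (30°33′22″N, 49°9′6″E) to SDG (35°14′45″N, 47°0′33″E), in decimal degrees
print(vincenty_km(30.556111, 49.151667, 35.245833, 47.009167))
```

Running this on the two airports' coordinates reproduces the roughly 557 km figure quoted above.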
Haversine formula
- 347.022 miles
- 558.478 kilometers
- 301.554 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
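The haversine calculation is short enough to sketch directly, using the airport coordinates listed below and assuming a mean Earth radius of 6,371 km:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two lat/lon points (degrees),
    assuming a spherical Earth of mean radius 6371 km."""
    r = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# MRX (30°33′22″N, 49°9′6″E) to SDG (35°14′45″N, 47°0′33″E), in decimal degrees
print(haversine_km(30.556111, 49.151667, 35.245833, 47.009167))  # ≈ 558 km
```

The result is about 558 km, matching the haversine figure above; the small difference from the Vincenty value reflects the spherical versus ellipsoidal Earth models.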
How long does it take to fly from Bandar Mahshahr to Sanandaj?
The estimated flight time from Mahshahr Airport to Sanandaj Airport is 1 hour and 9 minutes.
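The page doesn't state how this estimate is derived. A common rule of thumb (an assumption here, not necessarily this site's method) is a fixed allowance for taxi, climb and descent plus cruise time at a typical airliner speed:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead for taxi, climb and
    descent, plus cruise time at an assumed average speed. Both default
    parameters are illustrative assumptions, not the site's figures."""
    return overhead_min + distance_miles / cruise_mph * 60

est = flight_time_minutes(346)
print(f"{int(est // 60)} h {round(est % 60)} min")
```

For the 346-mile flight distance this rule of thumb lands within a few minutes of the 1 hour 9 minutes quoted above.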
What is the time difference between Bandar Mahshahr and Sanandaj?
There is no time difference between Bandar Mahshahr and Sanandaj.
Flight carbon footprint between Mahshahr Airport (MRX) and Sanandaj Airport (SDG)
On average, flying from Bandar Mahshahr to Sanandaj generates about 76 kg of CO2 per passenger (168 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
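The kilogram-to-pound conversion behind the figure above is just the standard factor of about 2.20462 lb per kg:

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 76  # per-passenger estimate quoted above
print(round(co2_kg * KG_TO_LB))  # → 168
```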
Map of flight path and driving directions from Bandar Mahshahr to Sanandaj
See the map of the shortest flight path between Mahshahr Airport (MRX) and Sanandaj Airport (SDG).
Airport information
| Origin | Mahshahr Airport |
|---|---|
| City | Bandar Mahshahr |
| Country | Iran |
| IATA Code | MRX |
| ICAO Code | OIAM |
| Coordinates | 30°33′22″N, 49°9′6″E |
| Destination | Sanandaj Airport |
|---|---|
| City | Sanandaj |
| Country | Iran |
| IATA Code | SDG |
| ICAO Code | OICS |
| Coordinates | 35°14′45″N, 47°0′33″E |