How far is Palembang from Yangon?
The distance between Yangon (Yangon International Airport) and Palembang (Sultan Mahmud Badaruddin II International Airport) is 1482 miles / 2384 kilometers / 1287 nautical miles.
Yangon International Airport – Sultan Mahmud Badaruddin II International Airport
Distance from Yangon to Palembang
There are several ways to calculate the distance from Yangon to Palembang. Here are two standard methods:
Vincenty's formula (applied above)
- 1481.561 miles
- 2384.342 kilometers
- 1287.441 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
Haversine formula
- 1488.000 miles
- 2394.704 kilometers
- 1293.037 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
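The haversine calculation can be sketched in Python. The coordinates below are the airport coordinates from the tables further down, converted to decimal degrees, and the 6,371 km Earth radius is the common spherical approximation (not necessarily the exact radius this site uses):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return radius_km * 2 * math.asin(math.sqrt(a))

# RGN: 16°54′26″N, 96°7′59″E  ->  (16.9072, 96.1331)
# PLM: 2°53′53″S, 104°41′59″E ->  (-2.8981, 104.6997)
d = haversine_km(16.9072, 96.1331, -2.8981, 104.6997)
print(f"{d:.1f} km")  # close to the 2394.7 km figure above
```

Small differences from the published figure come from rounding the coordinates and the choice of Earth radius.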
How long does it take to fly from Yangon to Palembang?
The estimated flight time from Yangon International Airport to Sultan Mahmud Badaruddin II International Airport is 3 hours and 18 minutes.
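Estimates like this are typically the great-circle distance divided by an average block speed, plus a fixed allowance for taxi, climb, and descent. The speed and buffer below are illustrative assumptions, not this site's actual parameters:

```python
def estimate_flight_time(distance_miles, avg_speed_mph=500, buffer_min=30):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb buffer.

    The 500 mph average speed and 30-minute buffer are assumed values
    for illustration; real estimates depend on aircraft type and routing.
    """
    minutes = distance_miles / avg_speed_mph * 60 + buffer_min
    hours, mins = divmod(round(minutes), 60)
    return hours, mins

h, m = estimate_flight_time(1482)
print(f"about {h} h {m} min")
```

With these assumed parameters the estimate lands in the same ballpark as the 3 h 18 min quoted above, but not exactly on it, since the underlying speed assumption differs.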
What is the time difference between Yangon and Palembang?
The time difference between Yangon and Palembang is 30 minutes: Palembang is 30 minutes ahead of Yangon (Yangon observes Myanmar Time, UTC+6:30, while Palembang observes Western Indonesia Time, UTC+7).
Flight carbon footprint between Yangon International Airport (RGN) and Sultan Mahmud Badaruddin II International Airport (PLM)
On average, flying from Yangon to Palembang generates about 178 kg of CO2 per passenger; 178 kilograms equals 392 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
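The kilograms-to-pounds figure follows from the standard conversion factor of about 2.20462 lb per kg; a minimal check:

```python
KG_TO_LB = 2.20462  # standard kilograms-to-pounds conversion factor

co2_kg = 178
co2_lb = co2_kg * KG_TO_LB
print(f"{co2_kg} kg = {co2_lb:.0f} lb")  # → 178 kg = 392 lb
```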
Map of flight path from Yangon to Palembang
See the map of the shortest flight path between Yangon International Airport (RGN) and Sultan Mahmud Badaruddin II International Airport (PLM).
Airport information
| Origin | Yangon International Airport |
|---|---|
| City: | Yangon |
| Country: | Burma |
| IATA Code: | RGN |
| ICAO Code: | VYYY |
| Coordinates: | 16°54′26″N, 96°7′59″E |
| Destination | Sultan Mahmud Badaruddin II International Airport |
|---|---|
| City: | Palembang |
| Country: | Indonesia |
| IATA Code: | PLM |
| ICAO Code: | WIPP |
| Coordinates: | 2°53′53″S, 104°41′59″E |