How far is Long Seridan from Shanghai?
The distance between Shanghai (Shanghai Hongqiao International Airport) and Long Seridan (Long Seridan Airport) is 1917 miles / 3085 kilometers / 1666 nautical miles.
Shanghai Hongqiao International Airport – Long Seridan Airport
Distance from Shanghai to Long Seridan
There are several ways to calculate the distance from Shanghai to Long Seridan. Here are two standard methods:
Vincenty's formula (applied above)
- 1917.211 miles
- 3085.453 kilometers
- 1666.011 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
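For readers who want to reproduce the figure, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is an illustration, not the calculator's own code; the decimal coordinates are converted from the airport tables further down the page.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns meters."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):      # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0        # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2 * sigma_m); a purely equatorial line has cos2_alpha == 0
        cos_2sigma_m = ((cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha)
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                                   * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# SHA and ODN coordinates in decimal degrees (from the airport tables below)
meters = vincenty_distance(31.19778, 121.33583, 3.96694, 115.05)
print(meters / 1609.344)   # ~1917.2 miles
```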
Haversine formula
- 1925.390 miles
- 3098.615 kilometers
- 1673.118 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
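The haversine calculation is much shorter. Here is a minimal sketch, assuming a mean Earth radius of 6,371 km (the radius this page uses is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius; returns km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_distance(31.19778, 121.33583, 3.96694, 115.05)
print(km)              # ~3098.6 km
print(km / 1.609344)   # ~1925.4 miles
print(km / 1.852)      # ~1673.1 nautical miles
```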
How long does it take to fly from Shanghai to Long Seridan?
The estimated flight time from Shanghai Hongqiao International Airport to Long Seridan Airport is 4 hours and 7 minutes.
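The page does not state how the estimate is derived. As a rough illustration, a simple cruise-speed model with assumed values (about 530 mph cruise plus 30 minutes for takeoff and landing) lands close to the quoted figure:

```python
def estimated_flight_time(distance_miles, cruise_mph=530, overhead_min=30):
    """Rough estimate: cruise time plus a fixed takeoff/landing overhead."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} hours and {minutes} minutes"

print(estimated_flight_time(1917.211))  # 4 hours and 7 minutes
```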
What is the time difference between Shanghai and Long Seridan?
There is no time difference between Shanghai and Long Seridan.
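Both airports keep UTC+8: Shanghai uses the Asia/Shanghai time zone, and Long Seridan, in Sarawak, Malaysia, falls under Asia/Kuching. This can be verified with Python's standard zoneinfo module:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

now = datetime.now(tz=ZoneInfo("UTC"))
for tz in ("Asia/Shanghai", "Asia/Kuching"):   # Long Seridan is in Sarawak
    print(tz, now.astimezone(ZoneInfo(tz)).utcoffset())  # both print 8:00:00
```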
Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Long Seridan Airport (ODN)
On average, flying from Shanghai to Long Seridan generates about 210 kg of CO2 per passenger, which is equivalent to 463 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
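The arithmetic behind these figures can be checked directly; the per-mile rate below is simply derived from the page's own numbers, not an official emission factor:

```python
co2_kg = 210
distance_miles = 1917.211

print(co2_kg * 2.20462)         # ~463 lbs
print(co2_kg / distance_miles)  # ~0.11 kg of CO2 per passenger-mile
```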
Map of flight path from Shanghai to Long Seridan
See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Long Seridan Airport (ODN).
Airport information
| Origin | Shanghai Hongqiao International Airport |
| --- | --- |
| City: | Shanghai |
| Country: | China |
| IATA Code: | SHA |
| ICAO Code: | ZSSS |
| Coordinates: | 31°11′52″N, 121°20′9″E |
| Destination | Long Seridan Airport |
| --- | --- |
| City: | Long Seridan |
| Country: | Malaysia |
| IATA Code: | ODN |
| ICAO Code: | WBGI |
| Coordinates: | 3°58′1″N, 115°3′0″E |
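The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier need decimal degrees. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# SHA: 31°11′52″N, 121°20′9″E
print(dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))
# ODN: 3°58′1″N, 115°3′0″E
print(dms_to_decimal(3, 58, 1, "N"), dms_to_decimal(115, 3, 0, "E"))
```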