How far is Al-Ubayyid from Shanghai?
The distance between Shanghai (Shanghai Pudong International Airport) and Al-Ubayyid (El Obeid Airport) is 5850 miles / 9415 kilometers / 5084 nautical miles.
Shanghai Pudong International Airport – El Obeid Airport
Distance from Shanghai to Al-Ubayyid
There are several ways to calculate the distance from Shanghai to Al-Ubayyid. Here are two standard methods:
Vincenty's formula (applied above):
- 5850.181 miles
- 9414.953 kilometers
- 5083.668 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
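For readers who want to reproduce the ellipsoidal figure, here is a minimal Python sketch using the geopy library (an assumption; install with `pip install geopy`). geopy's `geodesic` distance applies Karney's algorithm rather than Vincenty's, but both work on the WGS-84 ellipsoid and agree to well under a meter at this range. The coordinates are taken from the airport tables below.

```python
from geopy.distance import geodesic

# Airport coordinates from the tables below, converted to decimal degrees.
PVG = (31.1433, 121.8050)   # Shanghai Pudong: 31°8′36″N, 121°48′18″E
EBD = (13.1531, 30.2325)    # El Obeid: 13°9′11″N, 30°13′57″E

d = geodesic(PVG, EBD)      # ellipsoidal distance on WGS-84
print(f"{d.miles:,.3f} mi / {d.kilometers:,.3f} km / {d.nautical:,.3f} nmi")
```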
Haversine formula:
- 5842.474 miles
- 9402.550 kilometers
- 5076.971 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
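The haversine formula is short enough to implement directly. The sketch below assumes a mean Earth radius of 3,958.8 miles; the exact radius behind the figure above isn't stated, so small differences in the last digits are expected.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_mi=3958.8):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_mi * math.asin(math.sqrt(a))

# Shanghai Pudong (PVG) to El Obeid (EBD)
print(haversine_miles(31.1433, 121.8050, 13.1531, 30.2325))  # ≈ 5842 miles
```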
How long does it take to fly from Shanghai to Al-Ubayyid?
The estimated flight time from Shanghai Pudong International Airport to El Obeid Airport is 11 hours and 34 minutes.
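Estimates like this typically come from a simple block-time heuristic: divide the great-circle distance by an average cruise speed and add a fixed allowance for taxi, climb, and descent. The cruise speed and allowance below are illustrative assumptions, not the exact parameters behind the 11 hours 34 minutes figure, so the result differs slightly.

```python
def estimated_flight_time(distance_mi, cruise_mph=500, overhead_hr=0.5):
    # cruise_mph and overhead_hr are assumed illustrative values.
    hours = distance_mi / cruise_mph + overhead_hr
    return int(hours), round((hours % 1) * 60)

h, m = estimated_flight_time(5850.181)
print(f"about {h} h {m} min")  # ≈ 12 h 12 min under these assumptions
```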
What is the time difference between Shanghai and Al-Ubayyid?
The time difference between Shanghai and Al-Ubayyid is 6 hours. Shanghai is 6 hours ahead of Al-Ubayyid: China observes China Standard Time (UTC+8) and Sudan observes Central Africa Time (UTC+2), and neither country uses daylight saving time.
Flight carbon footprint between Shanghai Pudong International Airport (PVG) and El Obeid Airport (EBD)
On average, flying from Shanghai to Al-Ubayyid generates about 696 kg of CO2 per passenger; 696 kilograms equals 1,535 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
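The per-passenger figure works out to roughly 0.119 kg of CO2 per mile flown. Treating that as an assumed emission rate (back-solved from the numbers above, not a published constant) reproduces both the kilogram and pound figures:

```python
distance_mi = 5850.181
kg_per_passenger_mile = 0.119   # assumed rate, back-solved from the figure above
co2_kg = distance_mi * kg_per_passenger_mile
co2_lbs = co2_kg * 2.20462      # kilograms to pounds
print(f"{co2_kg:.0f} kg ≈ {co2_lbs:.0f} lbs")  # 696 kg ≈ 1535 lbs
```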
Map of flight path from Shanghai to Al-Ubayyid
See the map of the shortest flight path between Shanghai Pudong International Airport (PVG) and El Obeid Airport (EBD).
Airport information
| Origin | Shanghai Pudong International Airport |
| --- | --- |
| City | Shanghai |
| Country | China |
| IATA Code | PVG |
| ICAO Code | ZSPD |
| Coordinates | 31°8′36″N, 121°48′18″E |
| Destination | El Obeid Airport |
| --- | --- |
| City | Al-Ubayyid |
| Country | Sudan |
| IATA Code | EBD |
| ICAO Code | HSOB |
| Coordinates | 13°9′11″N, 30°13′57″E |