
How far is Shanghai from Ulaangom?

The distance between Ulaangom (Ulaangom Airport) and Shanghai (Shanghai Hongqiao International Airport) is 1998 miles / 3215 kilometers / 1736 nautical miles.

The driving distance from Ulaangom (ULO) to Shanghai (SHA) is 2430 miles / 3910 kilometers, and travel time by car is about 48 hours 25 minutes.

Ulaangom Airport – Shanghai Hongqiao International Airport

  • 1998 miles
  • 3215 kilometers
  • 1736 nautical miles


Distance from Ulaangom to Shanghai

There are several ways to calculate the distance from Ulaangom to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 1997.778 miles
  • 3215.113 kilometers
  • 1736.022 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
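A minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid, evaluated at the ULO and SHA coordinates listed under "Airport information" below; the ellipsoid constants and convergence tolerance are standard choices rather than values taken from this page.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in metres between two points on the WGS-84 ellipsoid."""
    a = 6378137.0                      # semi-major axis (m), WGS-84
    f = 1 / 298.257223563              # flattening, WGS-84
    b = (1 - f) * a                    # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate until the longitude term converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha if cos_sq_alpha else 0.0
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# Decimal-degree conversions of the coordinates listed under "Airport information"
ulo = (50.0664, 91.9381)    # Ulaangom Airport (ULO)
sha = (31.1978, 121.3358)   # Shanghai Hongqiao (SHA)
metres = vincenty_inverse(*ulo, *sha)
print(round(metres / 1609.344), "miles")   # ≈ 1998 miles
print(round(metres / 1000), "kilometers")  # ≈ 3215 kilometers
```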

Haversine formula
  • 1996.013 miles
  • 3212.271 kilometers
  • 1734.488 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
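A shorter haversine sketch for comparison, assuming a mean Earth radius of 3,958.8 miles (a common convention, not a value stated on this page):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance assuming a spherical Earth (mean radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Same ULO and SHA coordinates as above
print(round(haversine_miles(50.0664, 91.9381, 31.1978, 121.3358)), "miles")  # ≈ 1996 miles
```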

How long does it take to fly from Ulaangom to Shanghai?

The estimated flight time from Ulaangom Airport to Shanghai Hongqiao International Airport is 4 hours and 16 minutes.
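The page does not state how this estimate is derived. A common back-of-the-envelope approach divides the great-circle distance by an assumed average cruise speed and adds a fixed allowance for taxi, climb and descent; the sketch below uses assumed values of 500 mph and 30 minutes, which lands in the same ballpark as the figure quoted above.

```python
DISTANCE_MILES = 1998        # great-circle distance from above
CRUISE_MPH = 500             # assumed average cruise speed (not stated on this page)
OVERHEAD_MIN = 30            # assumed allowance for taxi, climb and descent

total_min = DISTANCE_MILES / CRUISE_MPH * 60 + OVERHEAD_MIN
hours, minutes = divmod(round(total_min), 60)
print(f"about {hours} h {minutes} min")   # ≈ 4 h 30 min with these assumptions
```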

Flight carbon footprint between Ulaangom Airport (ULO) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Ulaangom to Shanghai generates about 218 kg of CO2 per passenger, which is roughly 480 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
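The methodology behind the per-passenger figure is not given, but the numbers on this page imply an emission factor of roughly 0.11 kg of CO2 per passenger-mile (218 kg ÷ 1998 miles). The sketch below reproduces the kilogram figure from that implied factor and converts it to pounds.

```python
MILES = 1998                      # great-circle distance from above
KG_PER_PASSENGER_MILE = 0.109     # implied by 218 kg / 1998 miles (assumption)
KG_TO_LBS = 2.20462               # kilograms to pounds

co2_kg = MILES * KG_PER_PASSENGER_MILE
print(round(co2_kg), "kg")                  # ≈ 218 kg
print(round(co2_kg * KG_TO_LBS), "lbs")     # ≈ 480 lbs
```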

Map of flight path and driving directions from Ulaangom to Shanghai

See the map of the shortest flight path between Ulaangom Airport (ULO) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin: Ulaangom Airport
City: Ulaangom
Country: Mongolia
IATA Code: ULO
ICAO Code: ZMUG
Coordinates: 50°3′59″N, 91°56′17″E
Destination: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E