
How far is Wanganui from Shanghai?

The distance between Shanghai (Shanghai Pudong International Airport) and Wanganui (Whanganui Airport) is 5964 miles / 9598 kilometers / 5182 nautical miles.

Shanghai Pudong International Airport – Whanganui Airport
  • 5964 miles
  • 9598 kilometers
  • 5182 nautical miles


Distance from Shanghai to Wanganui

There are several ways to calculate the distance from Shanghai to Wanganui. Here are two standard methods:

Vincenty's formula (applied above)
  • 5963.740 miles
  • 9597.709 kilometers
  • 5182.348 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
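The site does not publish its implementation, but the standard iterative form of Vincenty's inverse formula on the WGS-84 ellipsoid reproduces the figure above to within a few kilometers. This is a sketch; the exact ellipsoid parameters and convergence tolerance used by the calculator are assumptions.

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance (km) via Vincenty's inverse formula on WGS-84."""
    a = 6378137.0                # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563        # WGS-84 flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(max_iter):
        sinlam, coslam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sinlam,
                               cosU1 * sinU2 - sinU1 * cosU2 * coslam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * coslam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sinlam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm
                + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# PVG (31°8′36″N, 121°48′18″E) to WAG (39°57′43″S, 175°1′29″E)
dist_km = vincenty_km(31.143333, 121.805, -39.961944, 175.024722)
```

The result is approximately 9598 km, matching the Vincenty figure listed above.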

Haversine formula
  • 5978.358 miles
  • 9621.234 kilometers
  • 5195.051 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
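Using the airport coordinates from the table below, the haversine formula can be checked with a few lines of Python (a mean Earth radius of 6371 km is assumed; the site's exact radius is not stated):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, R=6371.0):
    """Great-circle distance (km) between two points on a sphere of radius R."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# PVG (31°8′36″N, 121°48′18″E) to WAG (39°57′43″S, 175°1′29″E)
km = haversine_km(31.143333, 121.805, -39.961944, 175.024722)
miles = km / 1.609344   # statute miles
nmi = km / 1.852        # nautical miles
```

This yields roughly 9621 km / 5978 miles, consistent with the haversine figures above.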

How long does it take to fly from Shanghai to Wanganui?

The estimated flight time from Shanghai Pudong International Airport to Whanganui Airport is 11 hours and 47 minutes.
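The page does not state how this estimate is derived. A common rule of thumb is cruise distance at roughly 500 mph plus about 30 minutes for taxi, takeoff, and landing; the site evidently uses slightly different parameters, since this sketch gives a somewhat longer time:

```python
def flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough flight-time estimate: fixed overhead plus cruise at a constant speed."""
    total_hours = overhead_hours + distance_miles / cruise_mph
    hours = int(total_hours)
    minutes = round((total_hours - hours) * 60)
    return hours, minutes

h, m = flight_time(5964)  # ≈ 12 h 26 min under these assumed parameters
```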

Flight carbon footprint between Shanghai Pudong International Airport (PVG) and Whanganui Airport (WAG)

On average, flying from Shanghai to Wanganui generates about 712 kg of CO2 per passenger; 712 kilograms is about 1,569 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
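The kilogram-to-pound conversion can be verified directly (the page appears to truncate rather than round the result):

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 712
co2_lb = co2_kg / KG_PER_LB  # ≈ 1569.7 lbs; truncating gives the 1,569 shown
```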

Map of flight path from Shanghai to Wanganui

See the map of the shortest flight path between Shanghai Pudong International Airport (PVG) and Whanganui Airport (WAG).

Airport information

Origin Shanghai Pudong International Airport
City: Shanghai
Country: China
IATA Code: PVG
ICAO Code: ZSPD
Coordinates: 31°8′36″N, 121°48′18″E
Destination Whanganui Airport
City: Wanganui
Country: New Zealand
IATA Code: WAG
ICAO Code: NZWU
Coordinates: 39°57′43″S, 175°1′29″E
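The coordinates above are given in degrees, minutes, and seconds; the distance formulas need decimal degrees. A small converter (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

pvg_lat = dms_to_decimal(31, 8, 36, "N")    # ≈ 31.1433
wag_lat = dms_to_decimal(39, 57, 43, "S")   # ≈ -39.9619
```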