
How far is Nanaimo from Shanghai?

The distance between Shanghai (Shanghai Hongqiao International Airport) and Nanaimo (Nanaimo Airport) is 5610 miles / 9029 kilometers / 4875 nautical miles.


Distance from Shanghai to Nanaimo

There are several ways to calculate the distance from Shanghai to Nanaimo. Here are two standard methods:

Vincenty's formula (applied above)
  • 5610.186 miles
  • 9028.719 kilometers
  • 4875.118 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
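
As a rough illustration, the Python sketch below uses the geopy library, whose geodesic distance is likewise computed on an ellipsoidal (WGS-84) model of the earth. geopy is an assumed dependency here for illustration, not the calculator's actual implementation.

    # Ellipsoidal-earth distance sketch using geopy (pip install geopy).
    from geopy.distance import geodesic

    sha = (31.19778, 121.33583)   # Shanghai Hongqiao, 31°11′52″N 121°20′9″E
    ycd = (49.05222, -123.87000)  # Nanaimo Airport, 49°3′8″N 123°52′12″W

    d = geodesic(sha, ycd)  # WGS-84 ellipsoid by default
    print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} NM")
    # Should come out very close to the Vincenty figures above (~5610.2 mi)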

Haversine formula
  • 5597.675 miles
  • 9008.585 kilometers
  • 4864.247 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
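
For comparison, here is a minimal, self-contained haversine sketch in Python, assuming a spherical earth with a mean radius of 6371 km (the exact radius chosen shifts the result slightly):

    # Great-circle (haversine) distance on a sphere of radius ~6371 km.
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
        phi1, phi2 = radians(lat1), radians(lat2)
        dphi = radians(lat2 - lat1)
        dlam = radians(lon2 - lon1)
        a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
        return 2 * r_km * asin(sqrt(a))

    km = haversine_km(31.19778, 121.33583, 49.05222, -123.87000)
    print(f"{km:.1f} km / {km / 1.609344:.1f} mi / {km / 1.852:.1f} NM")
    # ~9008.6 km / 5597.7 mi / 4864.2 NM, matching the haversine figures above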

How long does it take to fly from Shanghai to Nanaimo?

The estimated flight time from Shanghai Hongqiao International Airport to Nanaimo Airport is 11 hours and 7 minutes.
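
The page does not state how this estimate is derived, but a simple back-of-the-envelope model, assuming an average speed of about 850 km/h plus roughly 30 minutes of overhead for taxi, climb, and descent, reproduces the figure. These parameters are assumptions for illustration only:

    # Hypothetical flight-time heuristic: cruise time plus fixed overhead.
    distance_km = 9029      # from the calculation above
    cruise_kmh = 850        # assumed average speed
    overhead_min = 30       # assumed taxi/climb/descent allowance

    total_min = round(distance_km / cruise_kmh * 60) + overhead_min
    print(f"{total_min // 60} hours and {total_min % 60} minutes")
    # -> 11 hours and 7 minutes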

Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Nanaimo Airport (YCD)

On average, flying from Shanghai to Nanaimo generates about 664 kg (1,464 lbs) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
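
As a quick unit check on the conversion above (1 kg ≈ 2.20462 lb):

    # Kilograms-to-pounds sanity check for the CO2 figure above.
    co2_kg = 664
    print(f"{co2_kg} kg ≈ {co2_kg * 2.20462:,.0f} lbs")
    # -> 664 kg ≈ 1,464 lbs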

Map of flight path from Shanghai to Nanaimo

See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Nanaimo Airport (YCD).

Airport information

Origin: Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E

Destination: Nanaimo Airport
City: Nanaimo
Country: Canada
IATA Code: YCD
ICAO Code: CYCD
Coordinates: 49°3′8″N, 123°52′12″W