
How far is Shanghai from Weifang?

The distance between Weifang (Weifang Nanyuan Airport) and Shanghai (Shanghai Hongqiao International Airport) is 397 miles / 638 kilometers / 345 nautical miles.

The driving distance from Weifang (WEF) to Shanghai (SHA) is 450 miles / 724 kilometers, and travel time by car is about 8 hours 18 minutes.

Weifang Nanyuan Airport – Shanghai Hongqiao International Airport

397 miles / 638 kilometers / 345 nautical miles


Distance from Weifang to Shanghai

There are several ways to calculate the distance from Weifang to Shanghai. Here are two standard methods:

Vincenty's formula (applied above)
  • 396.528 miles
  • 638.150 kilometers
  • 344.573 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
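The inverse Vincenty computation can be sketched in plain Python on the WGS-84 ellipsoid. The airport coordinates are taken from the airport information below; everything else (iteration limit, convergence tolerance) is a conventional choice, not this site's exact implementation.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_km(lat1, lon1, lat2, lon2):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns kilometers."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))   # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1 = sin(U1), cos(U1)
    sinU2, cosU2 = sin(U2), cos(U2)

    lam = L
    for _ in range(200):           # iterate lambda until convergence
        sin_lam, cos_lam = sin(lam), cos(lam)
        sin_sigma = sqrt((cosU2 * sin_lam) ** 2 +
                         (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm +
                C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2) -
        B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0

# DMS coordinates from the airport information below, in decimal degrees
d = vincenty_km(36 + 38/60 + 48/3600, 119 + 7/60 + 8/3600,   # WEF
                31 + 11/60 + 52/3600, 121 + 20/60 + 9/3600)  # SHA
print(f"{d:.3f} km")  # ≈ 638.15 km
```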

Haversine formula
  • 397.318 miles
  • 639.421 kilometers
  • 345.260 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
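The haversine calculation is short enough to show in full. This sketch uses a mean Earth radius of 6371 km (a common convention, not necessarily the exact radius used above) and the airport coordinates listed below:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

# Airport coordinates from the airport information below (DMS -> decimal)
WEF = (36 + 38/60 + 48/3600, 119 + 7/60 + 8/3600)   # Weifang Nanyuan
SHA = (31 + 11/60 + 52/3600, 121 + 20/60 + 9/3600)  # Shanghai Hongqiao

d = haversine_km(WEF[0], WEF[1], SHA[0], SHA[1])
print(f"{d:.1f} km")  # ≈ 639.4 km, matching the haversine figure above
```

The slightly larger result compared with Vincenty's formula reflects the spherical approximation.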

How long does it take to fly from Weifang to Shanghai?

The estimated flight time from Weifang Nanyuan Airport to Shanghai Hongqiao International Airport is 1 hour and 15 minutes.
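A figure in this range can be roughly reproduced with a common rule of thumb (an assumption for illustration, not this site's exact method): cruise speed of about 500 mph plus roughly half an hour for taxi, climb, and descent.

```python
# Rough flight-time heuristic (assumed values, not the site's exact method):
CRUISE_MPH = 500      # typical narrow-body cruise speed
OVERHEAD_MIN = 30     # taxi, climb, and descent allowance

distance_miles = 397
minutes = distance_miles / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # ≈ 1 h 18 min
```

The heuristic lands within a few minutes of the stated estimate; actual block times vary with aircraft type, routing, and winds.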

What is the time difference between Weifang and Shanghai?

There is no time difference between Weifang and Shanghai.

Flight carbon footprint between Weifang Nanyuan Airport (WEF) and Shanghai Hongqiao International Airport (SHA)

On average, flying from Weifang to Shanghai generates about 83 kg of CO2 per passenger; 83 kilograms equals about 183 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Weifang to Shanghai

See the map of the shortest flight path between Weifang Nanyuan Airport (WEF) and Shanghai Hongqiao International Airport (SHA).

Airport information

Origin Weifang Nanyuan Airport
City: Weifang
Country: China
IATA Code: WEF
ICAO Code: ZSWF
Coordinates: 36°38′48″N, 119°7′8″E
Destination Shanghai Hongqiao International Airport
City: Shanghai
Country: China
IATA Code: SHA
ICAO Code: ZSSS
Coordinates: 31°11′52″N, 121°20′9″E