How far is Shanghai from Yanji?
The distance between Yanji (Yanji Chaoyangchuan International Airport) and Shanghai (Shanghai Hongqiao International Airport) is 921 miles / 1482 kilometers / 800 nautical miles.
The driving distance from Yanji (YNJ) to Shanghai (SHA) is 1454 miles / 2340 kilometers, and the travel time by car is about 26 hours 29 minutes.
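The three air-distance units above are straight conversions of the same figure. A minimal sketch of that arithmetic, assuming the 921.146-mile value quoted below (conversion factors are exact by definition):

```python
MILES_TO_KM = 1.609344        # kilometres per statute mile (exact)
KM_PER_NAUTICAL_MILE = 1.852  # kilometres per nautical mile (exact)

air_miles = 921.146
air_km = air_miles * MILES_TO_KM        # ≈ 1482.4 km
air_nm = air_km / KM_PER_NAUTICAL_MILE  # ≈ 800.5 nautical miles
print(f"{air_km:.1f} km, {air_nm:.1f} nm")
```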
Yanji Chaoyangchuan International Airport – Shanghai Hongqiao International Airport
Distance from Yanji to Shanghai
There are several ways to calculate the distance from Yanji to Shanghai. Here are two standard methods:
Vincenty's formula (applied above)
- 921.146 miles
- 1482.441 kilometers
- 800.454 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
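For reference, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name, tolerance, and iteration cap are illustrative choices, and a maintained geodesy library is preferable in practice; this is a sketch of the standard algorithm, not this site's exact implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Return the ellipsoidal (WGS-84) distance in metres between two points."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # may not converge for nearly antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos2sm = 0.0 if cos_sq_alpha == 0 else cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos2sm + C * cos_sigma * (-1 + 2 * cos2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos2sm ** 2) -
        B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sm ** 2)))
    return b * A * (sigma - delta_sigma)

# YNJ and SHA coordinates from the airport tables below, in decimal degrees
print(vincenty_distance(42.8828, 129.4508, 31.1978, 121.3358) / 1000)  # ≈ 1482 km
```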
Haversine formula
- 922.014 miles
- 1483.837 kilometers
- 801.208 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
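As a comparison, here is a self-contained haversine sketch in Python, again using the airport coordinates from the tables below converted to decimal degrees; the 6371 km mean Earth radius is the usual convention for this formula.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2):
    """Return the great-circle distance in kilometres on a spherical Earth."""
    R = 6371.0  # mean Earth radius (km)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

ynj = (42.8828, 129.4508)  # 42°52′58″N, 129°27′3″E
sha = (31.1978, 121.3358)  # 31°11′52″N, 121°20′9″E
print(f"{haversine_distance(*ynj, *sha):.0f} km")  # ≈ 1484 km
```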
How long does it take to fly from Yanji to Shanghai?
The estimated flight time from Yanji Chaoyangchuan International Airport to Shanghai Hongqiao International Airport is 2 hours and 14 minutes.
What is the time difference between Yanji and Shanghai?
There is no time difference between Yanji and Shanghai; both cities observe China Standard Time (UTC+8).
Flight carbon footprint between Yanji Chaoyangchuan International Airport (YNJ) and Shanghai Hongqiao International Airport (SHA)
On average, flying from Yanji to Shanghai generates about 145 kg of CO2 per passenger (145 kilograms is equal to about 320 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
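A small sketch of the arithmetic behind those two numbers: the kilogram-to-pound conversion, plus the per-kilometre emission factor implied by the figures on this page (the factor is derived here purely for illustration and is not the site's published estimation methodology).

```python
KG_TO_LB = 2.20462  # pounds per kilogram

co2_kg = 145.0                          # estimated CO2 per passenger for this flight
co2_lb = co2_kg * KG_TO_LB              # ≈ 320 lb
distance_km = 1482.4                    # Vincenty distance from the section above
implied_factor = co2_kg / distance_km   # ≈ 0.098 kg CO2 per passenger-km
print(f"{co2_lb:.0f} lb, {implied_factor * 1000:.0f} g CO2 per passenger-km")
```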
Map of flight path and driving directions from Yanji to Shanghai
See the map of the shortest flight path between Yanji Chaoyangchuan International Airport (YNJ) and Shanghai Hongqiao International Airport (SHA).
Airport information
| Origin | Yanji Chaoyangchuan International Airport |
| --- | --- |
| City | Yanji |
| Country | China |
| IATA Code | YNJ |
| ICAO Code | ZYYJ |
| Coordinates | 42°52′58″N, 129°27′3″E |
| Destination | Shanghai Hongqiao International Airport |
| --- | --- |
| City | Shanghai |
| Country | China |
| IATA Code | SHA |
| ICAO Code | ZSSS |
| Coordinates | 31°11′52″N, 121°20′9″E |