How far is Long Beach, CA, from Shanghai?
The distance between Shanghai (Shanghai Hongqiao International Airport) and Long Beach (Long Beach Airport) is 6519 miles / 10491 kilometers / 5665 nautical miles.
Shanghai Hongqiao International Airport – Long Beach Airport
Distance from Shanghai to Long Beach
There are several ways to calculate the distance from Shanghai to Long Beach. Here are two standard methods:
Vincenty's formula (applied above)
- 6518.601 miles
- 10490.671 kilometers
- 5664.509 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
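For readers who want to reproduce the Vincenty figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The function name and the decimal coordinates (converted from the airport table below) are ours; the site does not publish its implementation.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Vincenty inverse solution on the WGS-84 ellipsoid; returns metres.

    May fail to converge for nearly antipodal points.
    """
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                      # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                        # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # points on the equator
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
            cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                             * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma)           # geodesic distance in metres

# SHA -> LGB; should land close to the 10,490.671 km / 6,518.601 mi above
metres = vincenty_distance(31.19778, 121.33583, 33.8175, -118.15194)
print(metres / 1000, metres / 1609.344)
```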
Haversine formula
- 6505.771 miles
- 10470.023 kilometers
- 5653.360 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance, i.e. the shortest path between two points along the surface of the sphere.
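The haversine calculation is much shorter. Below is a self-contained Python sketch, again using our decimal conversions of the coordinates in the airport table; the 6,371 km mean Earth radius is a common convention, and the site does not state which radius it uses.

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius (km)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# SHA -> LGB; should be close to the 10,470 km figure above
print(haversine_distance(31.19778, 121.33583, 33.8175, -118.15194))
```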
How long does it take to fly from Shanghai to Long Beach?
The estimated flight time from Shanghai Hongqiao International Airport to Long Beach Airport is 12 hours and 50 minutes.
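The source does not state its flight-time model. Working backwards, 6,519 miles in 12 hours 50 minutes implies an average block speed of roughly 508 mph, so a back-of-envelope sketch looks like this (the speed is inferred, not published):

```python
def flight_time(distance_miles, avg_speed_mph=508.0):
    """Rough block-time estimate: distance over an assumed average speed.

    508 mph is inferred from the 6,519 mi / 12 h 50 min figures above,
    not a value published by the source.
    """
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    return f"{h} hours and {m} minutes"

print(flight_time(6519))  # -> "12 hours and 50 minutes"
```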
What is the time difference between Shanghai and Long Beach?
Shanghai observes China Standard Time (UTC+8) year-round, while Long Beach is on Pacific Time (UTC−8, or UTC−7 during daylight saving). Shanghai is therefore 16 hours ahead of Long Beach in winter and 15 hours ahead in summer.
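You can confirm the offset with Python's standard-library zoneinfo module (Python 3.9+):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare the UTC offsets of the two cities at the same instant
now = datetime.now(ZoneInfo("UTC"))
sha = now.astimezone(ZoneInfo("Asia/Shanghai"))
lgb = now.astimezone(ZoneInfo("America/Los_Angeles"))
print(sha.utcoffset() - lgb.utcoffset())  # 16:00:00 in winter, 15:00:00 in summer
```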
Flight carbon footprint between Shanghai Hongqiao International Airport (SHA) and Long Beach Airport (LGB)
On average, flying from Shanghai to Long Beach generates about 788 kg of CO2 per passenger; 788 kilograms is roughly 1,737 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
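The unit conversion is straightforward (1 kg ≈ 2.20462 lb):

```python
KG_TO_LB = 2.20462            # pounds per kilogram
print(round(788 * KG_TO_LB))  # -> 1737
```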
Map of flight path from Shanghai to Long Beach
See the map of the shortest flight path between Shanghai Hongqiao International Airport (SHA) and Long Beach Airport (LGB).
Airport information
Origin | Shanghai Hongqiao International Airport |
---|---|
City: | Shanghai |
Country: | China |
IATA Code: | SHA |
ICAO Code: | ZSSS |
Coordinates: | 31°11′52″N, 121°20′9″E |
Destination | Long Beach Airport |
---|---|
City: | Long Beach, CA |
Country: | United States |
IATA Code: | LGB |
ICAO Code: | KLGB |
Coordinates: | 33°49′3″N, 118°9′7″W |
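The coordinates above are given in degrees, minutes, and seconds, while the distance formulas earlier expect decimal degrees. A small conversion helper (ours, not part of the source page):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

print(dms_to_decimal(31, 11, 52, "N"), dms_to_decimal(121, 20, 9, "E"))  # SHA ~ 31.19778, 121.33583
print(dms_to_decimal(33, 49, 3, "N"), dms_to_decimal(118, 9, 7, "W"))    # LGB ~ 33.8175, -118.15194
```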