How far is Lijiang from São Paulo?
The distance between São Paulo (São Paulo–Guarulhos International Airport) and Lijiang (Lijiang Sanyi International Airport) is 10354 miles / 16663 kilometers / 8997 nautical miles.
São Paulo–Guarulhos International Airport – Lijiang Sanyi International Airport
Distance from São Paulo to Lijiang
There are several ways to calculate the distance from São Paulo to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 10353.799 miles
- 16662.824 kilometers
- 8997.205 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
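As a sketch of how such an ellipsoidal calculation works, the following Python implements Vincenty's inverse method on the WGS-84 ellipsoid, using the airport coordinates from the table at the bottom of the page (converted to decimal degrees). This is an illustration of the general technique, not necessarily the exact implementation used to produce the figure above.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance (km) between two points, Vincenty inverse formula (WGS-84)."""
    a, f = 6378137.0, 1 / 298.257223563      # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):                      # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break
    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0  # metres -> km

# GRU (23°26′8″S, 46°28′23″W) to LJG (26°40′45″N, 100°14′44″E)
print(f"{vincenty_inverse(-23.4356, -46.4731, 26.6792, 100.2456):.1f} km")
```

The iteration converges quickly for non-antipodal points like these; the result agrees with the 16662.824 km figure above to within rounding of the coordinates.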
Haversine formula
- 10347.538 miles
- 16652.748 kilometers
- 8991.765 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
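The haversine calculation is compact enough to sketch in a few lines of Python. The coordinates come from the airport table at the bottom of the page (converted to decimal degrees), and the mean Earth radius of 6371 km is the standard spherical assumption.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

# GRU: 23°26′8″S, 46°28′23″W  -> (-23.4356, -46.4731)
# LJG: 26°40′45″N, 100°14′44″E -> (26.6792, 100.2456)
km = haversine(-23.4356, -46.4731, 26.6792, 100.2456)
print(f"{km:.0f} km, {km * 0.621371:.0f} mi, {km * 0.539957:.0f} nmi")
```

The spherical result lands within a few kilometres of the haversine figures listed above; the roughly 10 km gap to the Vincenty result reflects the Earth's flattening, which the spherical model ignores.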
How long does it take to fly from São Paulo to Lijiang?
The estimated flight time from São Paulo–Guarulhos International Airport to Lijiang Sanyi International Airport is 20 hours and 6 minutes.
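The page does not document how this estimate is derived, but a common approach is to divide the route distance by an assumed average block speed. The sketch below uses a 515 mph average speed purely as an illustrative assumption (it happens to reproduce the figure above); real flight times vary with winds, routing, and aircraft type.

```python
def flight_time(distance_miles, avg_speed_mph=515):
    """Rough block-time estimate: route distance over an assumed average speed.
    The 515 mph default is an illustrative assumption, not a figure from the page."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return h, m

h, m = flight_time(10353.799)
print(f"about {h} hours and {m} minutes")  # -> about 20 hours and 6 minutes
```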
What is the time difference between São Paulo and Lijiang?
The time difference between São Paulo and Lijiang is 11 hours: São Paulo observes Brasília Time (UTC−3) and Lijiang observes China Standard Time (UTC+8), so Lijiang is 11 hours ahead.
Flight carbon footprint between São Paulo–Guarulhos International Airport (GRU) and Lijiang Sanyi International Airport (LJG)
On average, flying from São Paulo to Lijiang generates about 1,357 kg of CO2 per passenger, equivalent to about 2,993 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
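The unit conversion and the implied per-kilometre intensity can be checked with a few lines of Python. Note that the per-km factor below is simply derived from the page's own numbers, not an official emission factor, and exact conversion of the rounded kilogram figure can differ from the page's pound figure by a pound or so.

```python
KG_PER_LB = 0.45359237            # exact definition of the avoirdupois pound

co2_kg = 1357                     # the page's per-passenger estimate for this route
distance_km = 16663               # great-circle distance from the page

co2_lbs = co2_kg / KG_PER_LB      # kilograms -> pounds
per_km = co2_kg / distance_km     # implied intensity, kg CO2 per passenger-km

print(f"{co2_lbs:.0f} lbs, {per_km:.3f} kg CO2 per passenger-km")
```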
Map of flight path from São Paulo to Lijiang
See the map of the shortest flight path between São Paulo–Guarulhos International Airport (GRU) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | São Paulo–Guarulhos International Airport |
| --- | --- |
| City: | São Paulo |
| Country: | Brazil |
| IATA Code: | GRU |
| ICAO Code: | SBGR |
| Coordinates: | 23°26′8″S, 46°28′23″W |

| Destination | Lijiang Sanyi International Airport |
| --- | --- |
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |