How far is Lijiang from Boston, MA?
The distance between Boston (Logan International Airport) and Lijiang (Lijiang Sanyi International Airport) is 7647 miles / 12307 kilometers / 6645 nautical miles.
Logan International Airport – Lijiang Sanyi International Airport
Distance from Boston to Lijiang
There are several ways to calculate the distance from Boston to Lijiang. Here are two standard methods:
Vincenty's formula (applied above)
- 7647.479 miles
- 12307.425 kilometers
- 6645.478 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
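As a sketch, the same ellipsoidal inverse problem can be solved with pyproj's geodesic engine (it uses Karney's algorithm rather than Vincenty's iteration, but both work on the WGS84 ellipsoid and agree to well under a meter at this range). The decimal coordinates are converted from the airport table below:

```python
# Sketch: ellipsoidal (WGS84) distance between BOS and LJG.
# pyproj's Geod solves the same inverse problem as Vincenty's formula.
from pyproj import Geod

geod = Geod(ellps="WGS84")

# Decimal-degree coordinates derived from the airport table below.
bos_lat, bos_lon = 42.3642, -71.0050   # Logan International Airport
ljg_lat, ljg_lon = 26.6792, 100.2456   # Lijiang Sanyi International Airport

# Geod.inv takes lon/lat order and returns azimuths plus distance in metres.
_, _, dist_m = geod.inv(bos_lon, bos_lat, ljg_lon, ljg_lat)

print(f"{dist_m / 1609.344:.3f} miles")       # ≈ 7647 miles
print(f"{dist_m / 1000:.3f} kilometers")      # ≈ 12307 km
print(f"{dist_m / 1852:.3f} nautical miles")  # ≈ 6645 NM
```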
Haversine formula
- 7633.852 miles
- 12285.493 kilometers
- 6633.636 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
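A minimal stdlib-only sketch of the haversine calculation, assuming a mean earth radius of 6371.0088 km (a common convention; the radius used by the calculator above isn't stated):

```python
# Sketch: great-circle (haversine) distance on a spherical earth.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * 6371.0088 * asin(sqrt(a))  # assumed mean radius, km

km = haversine_km(42.3642, -71.0050, 26.6792, 100.2456)
print(f"{km:.3f} km, {km / 1.609344:.3f} miles")  # ≈ 12285 km / 7634 miles
```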
How long does it take to fly from Boston to Lijiang?
The estimated flight time from Logan International Airport to Lijiang Sanyi International Airport is 14 hours and 58 minutes.
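The calculator's speed assumptions aren't published; as a rough sketch, an assumed average block speed of about 511 mph happens to reproduce the quoted figure for the Vincenty distance:

```python
# Sketch: flight-time estimate. The exact assumptions aren't stated;
# an assumed average speed of ~511 mph reproduces the quoted 14 h 58 min.
distance_miles = 7647.479
avg_speed_mph = 511  # assumption, not from the source

hours = distance_miles / avg_speed_mph
h, m = int(hours), round((hours % 1) * 60)
print(f"{h} hours and {m} minutes")  # 14 hours and 58 minutes
```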
What is the time difference between Boston and Lijiang?
The time difference between Boston and Lijiang is 13 hours: Lijiang is ahead of Boston.
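The offset can be verified with Python's standard zoneinfo module. Lijiang observes China Standard Time (Asia/Shanghai, UTC+8, no daylight saving), so the 13-hour gap applies while Boston is on EST and narrows to 12 hours during EDT:

```python
# Sketch: computing the Boston/Lijiang offset with stdlib zoneinfo.
from datetime import datetime
from zoneinfo import ZoneInfo

boston = ZoneInfo("America/New_York")
lijiang = ZoneInfo("Asia/Shanghai")  # China Standard Time, no DST

t = datetime(2024, 1, 15, 9, 0, tzinfo=boston)  # a winter (EST) date
diff = t.astimezone(lijiang).utcoffset() - t.utcoffset()
print(diff)  # 13:00:00
```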
Flight carbon footprint between Logan International Airport (BOS) and Lijiang Sanyi International Airport (LJG)
On average, flying from Boston to Lijiang generates about 947 kg of CO2 per passenger; 947 kilograms equals 2,088 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
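As a sanity check on those numbers, converting with the standard factor of 2.20462 lb/kg and dividing by the Vincenty distance gives the implied per-passenger emission intensity:

```python
# Sketch: unit check on the quoted footprint figures.
co2_kg = 947
distance_km = 12307.425

print(f"{co2_kg * 2.20462:.0f} lbs")                  # ≈ 2088 lbs
print(f"{co2_kg / distance_km * 1000:.0f} g CO2/km")  # ≈ 77 g per passenger-km
```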
Map of flight path from Boston to Lijiang
See the map of the shortest flight path between Logan International Airport (BOS) and Lijiang Sanyi International Airport (LJG).
Airport information
| Origin | Logan International Airport |
|---|---|
| City: | Boston, MA |
| Country: | United States |
| IATA Code: | BOS |
| ICAO Code: | KBOS |
| Coordinates: | 42°21′51″N, 71°0′18″W |
| Destination | Lijiang Sanyi International Airport |
|---|---|
| City: | Lijiang |
| Country: | China |
| IATA Code: | LJG |
| ICAO Code: | ZPLJ |
| Coordinates: | 26°40′45″N, 100°14′44″E |
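The decimal-degree coordinates used in the code sketches above can be derived from the table's degrees/minutes/seconds values with a small helper (dms_to_decimal here is an illustrative function, not a library call):

```python
# Sketch: converting the table's DMS coordinates to decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(42, 21, 51, "N"))   # BOS latitude  ≈ 42.3642
print(dms_to_decimal(71, 0, 18, "W"))    # BOS longitude ≈ -71.0050
print(dms_to_decimal(26, 40, 45, "N"))   # LJG latitude  ≈ 26.6792
print(dms_to_decimal(100, 14, 44, "E"))  # LJG longitude ≈ 100.2456
```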