
How far is Lijiang from Rome?

The distance between Rome (Ciampino–G. B. Pastine International Airport) and Lijiang (Lijiang Sanyi International Airport) is 4911 miles / 7903 kilometers / 4268 nautical miles.

Distance from Rome to Lijiang

There are several ways to calculate the distance from Rome to Lijiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 4910.957 miles
  • 7903.419 kilometers
  • 4267.505 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
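As a rough sketch of how such an ellipsoidal distance can be computed in practice, the snippet below uses the pyproj library's Geod class (an assumption; the page does not say which implementation it uses), with the airport coordinates listed further down converted to decimal degrees:

```python
from pyproj import Geod

# WGS84 ellipsoid; pyproj solves the geodesic inverse problem,
# which is what Vincenty's formula computes iteratively.
geod = Geod(ellps="WGS84")

# Decimal-degree coordinates derived from the airport information below.
cia_lat, cia_lon = 41.79917, 12.59472    # Rome Ciampino (CIA)
ljg_lat, ljg_lon = 26.67917, 100.24556   # Lijiang Sanyi (LJG)

# geod.inv takes lon/lat order and returns the forward azimuth,
# back azimuth, and distance in metres.
_, _, dist_m = geod.inv(cia_lon, cia_lat, ljg_lon, ljg_lat)

print(f"{dist_m / 1000:.3f} km")      # ~7903 km
print(f"{dist_m / 1609.344:.3f} mi")  # ~4911 mi
```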

Haversine formula
  • 4901.446 miles
  • 7888.112 kilometers
  • 4259.240 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
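The haversine calculation is simple enough to write out directly. A minimal sketch, assuming a mean Earth radius of 6,371 km (the radius chosen accounts for small differences from the figures above):

```python
from math import radians, sin, cos, atan2, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return radius_km * 2 * atan2(sqrt(a), sqrt(1 - a))

# Rome Ciampino (CIA) to Lijiang Sanyi (LJG)
print(f"{haversine_km(41.79917, 12.59472, 26.67917, 100.24556):.1f} km")  # ~7888 km
```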

How long does it take to fly from Rome to Lijiang?

The estimated flight time from Ciampino–G. B. Pastine International Airport to Lijiang Sanyi International Airport is 9 hours and 47 minutes.
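The page does not state how it derives this figure, but dividing the straight-line distance by the elapsed time implies an average speed of roughly 500 mph, a common rule-of-thumb block speed for long-haul jets. A minimal sketch of that kind of estimate:

```python
distance_mi = 4911
flight_h, flight_min = 9, 47

# Implied average speed over the whole trip (distance / block time).
avg_mph = distance_mi / (flight_h + flight_min / 60)
print(f"{avg_mph:.0f} mph")  # ~502 mph

# Reversing the estimate with an assumed ~500 mph block speed
# lands within a few minutes of the quoted 9 h 47 min.
est_h = distance_mi / 500
print(f"{int(est_h)} h {round(est_h % 1 * 60)} min")  # 9 h 49 min
```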

Flight carbon footprint between Ciampino–G. B. Pastine International Airport (CIA) and Lijiang Sanyi International Airport (LJG)

On average, flying from Rome to Lijiang generates about 572 kg of CO2 per passenger, or roughly 1,262 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
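The page does not describe its emissions methodology, but the quoted figure implies a carbon intensity of about 72 g of CO2 per passenger-kilometre, within the usual range for long-haul flying. A small sketch deriving that intensity and the pound conversion from the numbers above:

```python
co2_kg = 572
distance_km = 7903

# Implied per-passenger carbon intensity for this route.
intensity = co2_kg / distance_km * 1000
print(f"{intensity:.1f} g CO2 per passenger-km")  # ~72.4 g

# Kilograms to pounds (1 kg = 2.20462 lb); the page quotes 1,262 lbs,
# so its unrounded kg figure is presumably slightly above 572.
print(f"{co2_kg * 2.20462:.0f} lbs")  # ~1261 lbs
```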

Map of flight path from Rome to Lijiang

See the map of the shortest flight path between Ciampino–G. B. Pastine International Airport (CIA) and Lijiang Sanyi International Airport (LJG).

Airport information

Origin: Ciampino–G. B. Pastine International Airport
City: Rome
Country: Italy
IATA Code: CIA
ICAO Code: LIRA
Coordinates: 41°47′57″N, 12°35′41″E
Destination: Lijiang Sanyi International Airport
City: Lijiang
Country: China
IATA Code: LJG
ICAO Code: ZPLJ
Coordinates: 26°40′45″N, 100°14′44″E
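The coordinates above are given in degrees, minutes and seconds; the decimal-degree values used in the snippets earlier follow from the standard conversion (degrees + minutes/60 + seconds/3600, negated for S or W). A minimal helper:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# CIA: 41°47′57″N, 12°35′41″E
print(dms_to_decimal(41, 47, 57, "N"), dms_to_decimal(12, 35, 41, "E"))
# LJG: 26°40′45″N, 100°14′44″E
print(dms_to_decimal(26, 40, 45, "N"), dms_to_decimal(100, 14, 44, "E"))
```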