
How far is Quanzhou from Longyearbyen?

The distance between Longyearbyen (Svalbard Airport, Longyear) and Quanzhou (Quanzhou Jinjiang International Airport) is 4730 miles / 7613 kilometers / 4110 nautical miles.

Svalbard Airport, Longyear – Quanzhou Jinjiang International Airport

  • 4730 miles
  • 7613 kilometers
  • 4110 nautical miles


Distance from Longyearbyen to Quanzhou

There are several ways to calculate the distance from Longyearbyen to Quanzhou. Here are two standard methods:

Vincenty's formula (applied above)
  • 4730.215 miles
  • 7612.543 kilometers
  • 4110.445 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
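The site does not publish its implementation, but Vincenty's inverse method is standard. Below is a minimal Python sketch on the WGS-84 ellipsoid; fed with the LYR and JJN coordinates listed further down, it should land close to the 7,612.5 km figure above, with the final digits depending on rounding and the convergence tolerance chosen.

import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0               # semi-major axis (m)
    f = 1 / 298.257223563       # flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        if cos_sq_alpha != 0:
            cos2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        else:
            cos2sigma_m = 0.0   # points on the equator
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
              (cos2sigma_m + C * cos_sigma * (-1 + 2 * cos2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos2sigma_m + B / 4 *
        (cos_sigma * (-1 + 2 * cos2sigma_m ** 2) -
         B / 6 * cos2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# LYR (78°14′45″N, 15°27′56″E) and JJN (24°47′47″N, 118°35′23″E) in decimal degrees
print(vincenty_distance(78.245833, 15.465556, 24.796389, 118.589722) / 1000)
# ≈ 7612–7613 km, in line with the figure quoted above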

Haversine formula
  • 4723.921 miles
  • 7602.415 kilometers
  • 4104.975 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
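The haversine calculation is short enough to show in full. The result depends on the Earth radius used; a mean radius of 6,371 km is a common choice, though the site does not state which value it assumes.

import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres, assuming a spherical Earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

print(haversine_distance(78.245833, 15.465556, 24.796389, 118.589722))
# ≈ 7602 km, matching the figure above to within the choice of radius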

How long does it take to fly from Longyearbyen to Quanzhou?

The estimated flight time from Svalbard Airport, Longyear to Quanzhou Jinjiang International Airport is 9 hours and 27 minutes.
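The site does not say how it estimates flight time. A common rule of thumb is to divide the great-circle distance by an assumed average speed of roughly 500 mph, which lands within a minute or two of the figure quoted above; the sketch below uses that assumption.

def estimated_flight_time(distance_miles, avg_speed_mph=500):
    """Back-of-the-envelope block time from distance and an assumed average speed."""
    total_minutes = round(distance_miles / avg_speed_mph * 60)
    return f"{total_minutes // 60} hours and {total_minutes % 60} minutes"

print(estimated_flight_time(4730.215))
# "9 hours and 28 minutes", within a minute of the 9 h 27 min quoted above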

Flight carbon footprint between Svalbard Airport, Longyear (LYR) and Quanzhou Jinjiang International Airport (JJN)

On average, flying from Longyearbyen to Quanzhou generates about 549 kg of CO2 per passenger, which is equivalent to 1,210 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
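The emission model behind the 549 kg figure is not published. The sketch below is only an illustration: it applies a flat per-mile factor (a hypothetical value backed out of 549 kg over 4,730 miles) and converts the result to pounds.

KG_PER_LB = 0.45359237

def co2_per_passenger_kg(distance_miles, kg_per_mile=0.116):
    # kg_per_mile is a hypothetical flat factor implied by the numbers above,
    # not the site's actual model.
    return distance_miles * kg_per_mile

kg = co2_per_passenger_kg(4730)
print(f"{kg:.0f} kg ≈ {kg / KG_PER_LB:.0f} lb")   # ≈ 549 kg ≈ 1210 lb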

Map of flight path from Longyearbyen to Quanzhou

See the map of the shortest flight path between Svalbard Airport, Longyear (LYR) and Quanzhou Jinjiang International Airport (JJN).

Airport information

Origin: Svalbard Airport, Longyear
City: Longyearbyen
Country: Norway
IATA Code: LYR
ICAO Code: ENSB
Coordinates: 78°14′45″N, 15°27′56″E

Destination: Quanzhou Jinjiang International Airport
City: Quanzhou
Country: China
IATA Code: JJN
ICAO Code: ZSQZ
Coordinates: 24°47′47″N, 118°35′23″E
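The coordinates above are given in degrees, minutes and seconds, while the distance sketches earlier expect decimal degrees; a small helper (hypothetical, not part of the site) handles the conversion.

import re

def dms_to_decimal(dms):
    """Convert a coordinate such as 78°14′45″N to decimal degrees."""
    deg, minute, sec, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    return -value if hemi in "SW" else value

print(dms_to_decimal("78°14′45″N"), dms_to_decimal("118°35′23″E"))
# ≈ 78.2458, 118.5897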