
How far is Qianjiang from Lampang?

The distance between Lampang (Lampang Airport) and Qianjiang (Qianjiang Wulingshan Airport) is 972 miles / 1565 kilometers / 845 nautical miles.

The driving distance from Lampang (LPT) to Qianjiang (JIQ) is 1304 miles / 2099 kilometers, and the travel time by car is about 25 hours 25 minutes.

Lampang Airport – Qianjiang Wulingshan Airport

972 miles / 1565 kilometers / 845 nautical miles


Distance from Lampang to Qianjiang

There are several ways to calculate the distance from Lampang to Qianjiang. Here are two standard methods:

Vincenty's formula (applied above)
  • 972.154 miles
  • 1564.530 kilometers
  • 844.778 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
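The page does not show its implementation, but as an illustrative sketch, a comparable ellipsoidal distance can be computed in Python with pyproj's Geod class. Note that pyproj uses Karney's geodesic algorithm rather than Vincenty's iteration, though the two agree to well under a metre over a route of this length; the coordinates come from the "Airport information" section below.

```python
# Sketch of an ellipsoidal (WGS-84) distance calculation, assuming pyproj is installed.
# This is not the site's exact implementation; it uses Karney's geodesic algorithm,
# which closely matches Vincenty's result for this route.
from pyproj import Geod

# Airport coordinates in decimal degrees (lat, lon), from the airport information below
LPT = (18.270833, 99.504167)   # Lampang Airport
JIQ = (29.513056, 108.830833)  # Qianjiang Wulingshan Airport

geod = Geod(ellps="WGS84")
_, _, dist_m = geod.inv(LPT[1], LPT[0], JIQ[1], JIQ[0])  # inv() takes lon, lat order

print(f"{dist_m / 1000:.3f} km")               # ~1564.5 km
print(f"{dist_m / 1609.344:.3f} miles")        # ~972.2 miles
print(f"{dist_m / 1852:.3f} nautical miles")   # ~844.8 nautical miles
```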

Haversine formula
  • 973.986 miles
  • 1567.479 kilometers
  • 846.371 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
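The haversine calculation is simple enough to reproduce directly. The sketch below assumes a mean Earth radius of 6,371 km and uses the airport coordinates listed in the "Airport information" section below.

```python
# Self-contained haversine (great-circle) distance, assuming a spherical Earth
# with mean radius 6371 km.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * radius_km * asin(sqrt(a))

km = haversine_km(18.270833, 99.504167, 29.513056, 108.830833)
print(f"{km:.3f} km")              # ~1567.5 km
print(f"{km / 1.609344:.3f} mi")   # ~974.0 miles
print(f"{km / 1.852:.3f} nm")      # ~846.4 nautical miles
```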

How long does it take to fly from Lampang to Qianjiang?

The estimated flight time from Lampang Airport to Qianjiang Wulingshan Airport is 2 hours and 20 minutes.
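The page does not state how this estimate is derived. A common back-of-the-envelope approach is to divide the flight distance by a typical cruise speed and add a fixed allowance for taxi, climb, and descent; the speed and allowance in the sketch below are assumptions, which is why the result only roughly matches the quoted 2 hours 20 minutes.

```python
# Rough flight-time estimate. Cruise speed and overhead are assumed values,
# not the site's published parameters.
CRUISE_MPH = 500          # assumed average cruise speed
OVERHEAD_MIN = 30         # assumed fixed allowance for takeoff and landing
DISTANCE_MILES = 972.154  # Vincenty distance from above

total_min = DISTANCE_MILES / CRUISE_MPH * 60 + OVERHEAD_MIN
print(f"{int(total_min // 60)} h {int(total_min % 60)} min")  # ~2 h 26 min
```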

Flight carbon footprint between Lampang Airport (LPT) and Qianjiang Wulingshan Airport (JIQ)

On average, flying from Lampang to Qianjiang generates about 149 kg of CO2 per passenger, which is equivalent to roughly 328 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
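As a quick check of the arithmetic, the sketch below converts the 149 kg figure to pounds and shows the per-mile emission rate it implies for this route; the 149 kg value itself comes from the page, not from this code.

```python
# Unit conversion and implied per-mile rate for the CO2 estimate above.
KG_TO_LBS = 2.20462

co2_kg = 149.0            # per-passenger estimate quoted above
distance_miles = 972.154  # Vincenty distance from above

print(f"{co2_kg * KG_TO_LBS:.0f} lbs")               # ~328 lbs
print(f"{co2_kg / distance_miles:.3f} kg CO2/mile")  # ~0.153 kg per passenger-mile
```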

Map of flight path and driving directions from Lampang to Qianjiang

See the map of the shortest flight path between Lampang Airport (LPT) and Qianjiang Wulingshan Airport (JIQ).

Airport information

Origin: Lampang Airport
City: Lampang
Country: Thailand
IATA Code: LPT
ICAO Code: VTCL
Coordinates: 18°16′15″N, 99°30′15″E
Destination: Qianjiang Wulingshan Airport
City: Qianjiang
Country: China
IATA Code: JIQ
ICAO Code: ZUQJ
Coordinates: 29°30′47″N, 108°49′51″E
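The coordinates above are given in degrees, minutes, and seconds. The short sketch below converts them to the decimal degrees used by the distance formulas earlier on the page; the dms_to_decimal helper is illustrative, not part of any particular library.

```python
# Convert degrees/minutes/seconds coordinates to decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Southern and western hemispheres get a negative sign."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Lampang Airport (LPT): 18°16′15″N, 99°30′15″E
lpt = (dms_to_decimal(18, 16, 15, "N"), dms_to_decimal(99, 30, 15, "E"))
# Qianjiang Wulingshan Airport (JIQ): 29°30′47″N, 108°49′51″E
jiq = (dms_to_decimal(29, 30, 47, "N"), dms_to_decimal(108, 49, 51, "E"))

print(lpt)  # (18.2708..., 99.5041...)
print(jiq)  # (29.5130..., 108.8308...)
```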