How far is Longyan from Bhuj?

The distance between Bhuj (Bhuj Airport) and Longyan (Longyan Guanzhishan Airport) is 2954 miles / 4755 kilometers / 2567 nautical miles.

The driving distance from Bhuj (BHJ) to Longyan (LCX) is 3895 miles / 6268 kilometers, and travel time by car is about 76 hours 57 minutes.

Bhuj Airport – Longyan Guanzhishan Airport

Distance: 2954 miles / 4755 kilometers / 2567 nautical miles
Flight time: 6 h 5 min
Time difference: 2 h 30 min
CO2 emission: 329 kg

Distance from Bhuj to Longyan

There are several ways to calculate the distance from Bhuj to Longyan. Here are two standard methods:

Vincenty's formula (applied above)
  • 2954.354 miles
  • 4754.572 kilometers
  • 2567.264 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
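As a rough sketch, a comparable ellipsoidal distance can be reproduced in Python with pyproj. Its Geod class implements Karney's geodesic algorithm rather than Vincenty's iteration, but both work on the WGS-84 ellipsoid and agree to well under a metre on a route like this. The decimal coordinates are converted from the DMS values in the airport information section below.

```python
from pyproj import Geod

# Airport coordinates in decimal degrees, converted from the DMS values
# listed under "Airport information" below.
BHJ = (23.2878, 69.6700)    # Bhuj Airport: 23°17′16″N, 69°40′12″E
LCX = (25.6744, 116.7469)   # Longyan Guanzhishan: 25°40′28″N, 116°44′49″E

geod = Geod(ellps="WGS84")
# inv() takes longitudes first, then latitudes; distance is in metres.
_, _, meters = geod.inv(BHJ[1], BHJ[0], LCX[1], LCX[0])

print(f"{meters / 1000:.3f} km")        # ≈ 4754.6 km
print(f"{meters / 1609.344:.3f} miles") # ≈ 2954.4 miles
print(f"{meters / 1852:.3f} NM")        # ≈ 2567.3 nautical miles
```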

Haversine formula
  • 2949.411 miles
  • 4746.617 kilometers
  • 2562.968 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
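A minimal self-contained Python sketch of the haversine calculation, assuming the commonly used mean earth radius of 6371 km (the exact radius used above is not stated, so the last digits may differ slightly):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km: assumed mean earth radius

km = haversine_km(23.2878, 69.6700, 25.6744, 116.7469)
print(f"{km:.1f} km")            # ≈ 4746.7 km, matching the figure above
print(f"{km / 1.609344:.1f} mi") # ≈ 2949.5 miles, within rounding
```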

How long does it take to fly from Bhuj to Longyan?

The estimated flight time from Bhuj Airport to Longyan Guanzhishan Airport is 6 hours and 5 minutes.
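The page does not state how this flight time is derived; a common approach is distance divided by an assumed average block speed. Backing the numbers out, 2954 miles in 6 h 5 min implies roughly 486 mph, so a purely hypothetical sketch looks like this:

```python
def flight_time(distance_miles, avg_speed_mph=486):
    """Format a flight-time estimate; 486 mph is an assumed average
    block speed backed out of this route's figures, not a stated value."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    return f"{h} h {m} min"

print(flight_time(2954.354))  # -> "6 h 5 min"
```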

Flight carbon footprint between Bhuj Airport (BHJ) and Longyan Guanzhishan Airport (LCX)

On average, flying from Bhuj to Longyan generates about 329 kg of CO2 per passenger; 329 kilograms is equal to 725 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
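As a quick sketch of the arithmetic behind these figures (the per-mile factor is backed out of this route's numbers, not taken from the site's stated methodology):

```python
KG_PER_LB = 0.45359237  # exact definition of the pound in kilograms

co2_kg = 329
print(f"{co2_kg / KG_PER_LB:.0f} lbs")  # -> 725 lbs

# Implied emission factor for this route (an observation only):
print(f"{co2_kg / 2954.354:.3f} kg CO2 per mile")  # ≈ 0.111 kg/mile
```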

Map of flight path and driving directions from Bhuj to Longyan

See the map of the shortest flight path between Bhuj Airport (BHJ) and Longyan Guanzhishan Airport (LCX).

Airport information

Origin: Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E
Destination: Longyan Guanzhishan Airport
City: Longyan
Country: China
IATA Code: LCX
ICAO Code: ZSLD
Coordinates: 25°40′28″N, 116°44′49″E
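The coordinates above are given in degrees, minutes, and seconds; a small hypothetical helper converts them to the decimal degrees used in the distance sketches earlier on this page:

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Convert a degrees/minutes/seconds coordinate to decimal degrees."""
    return degrees + minutes / 60 + seconds / 3600

print(dms_to_decimal(23, 17, 16))   # Bhuj latitude     -> ≈ 23.2878
print(dms_to_decimal(69, 40, 12))   # Bhuj longitude    -> 69.67
print(dms_to_decimal(25, 40, 28))   # Longyan latitude  -> ≈ 25.6744
print(dms_to_decimal(116, 44, 49))  # Longyan longitude -> ≈ 116.7469
```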