
How far is Yichun from Bhuj?

The distance between Bhuj (Bhuj Airport) and Yichun (Yichun Mingyueshan Airport) is 2790 miles / 4490 kilometers / 2424 nautical miles.

The driving distance from Bhuj (BHJ) to Yichun (YIC) is 3698 miles / 5951 kilometers, and travel time by car is about 73 hours 25 minutes.

Bhuj Airport – Yichun Mingyueshan Airport

Distance: 2790 miles / 4490 kilometers / 2424 nautical miles
Flight time: 5 h 46 min
Time difference: 2 h 30 min
CO2 emission: 309 kg


Distance from Bhuj to Yichun

There are several ways to calculate the distance from Bhuj to Yichun. Here are two standard methods:

Vincenty's formula (applied above)
  • 2789.783 miles
  • 4489.720 kilometers
  • 2424.255 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
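For reference, here is a minimal Python sketch of Vincenty's inverse solution on the WGS-84 ellipsoid (semi-major axis 6378137 m, flattening 1/298.257223563), using the airport coordinates listed in the airport information section. The calculator's exact constants and rounding aren't stated, so this should only come out close to the figures above, not match them digit for digit.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2,
                          a=6378137.0, f=1 / 298.257223563,
                          tol=1e-12, max_iter=200):
        """Inverse Vincenty: distance in metres on the WGS-84 ellipsoid."""
        b = (1 - f) * a
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0                       # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            cos2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
                      if cos_sq_alpha else 0.0)  # equatorial line
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (cos2sm + C * cos_sigma
                                         * (-1 + 2 * cos2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        d_sigma = B * sin_sigma * (cos2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos2sm ** 2)
            - B / 6 * cos2sm * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # Airport coordinates from the airport information section (DMS -> decimal).
    bhj = (23 + 17 / 60 + 16 / 3600, 69 + 40 / 60 + 12 / 3600)   # BHJ
    yic = (27 + 48 / 60 + 9 / 3600, 114 + 18 / 60 + 22 / 3600)   # YIC
    m = vincenty_distance(*bhj, *yic)
    print(f"{m / 1609.344:.1f} mi / {m / 1000:.1f} km / {m / 1852:.1f} NM")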

Haversine formula
  • 2785.122 miles
  • 4482.220 kilometers
  • 2420.205 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
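A corresponding great-circle sketch in Python, assuming a mean Earth radius of 6371.0088 km (the radius the calculator actually uses isn't stated):

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, r_km=6371.0088):
        """Great-circle distance in kilometres on a sphere of radius r_km."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r_km * math.asin(math.sqrt(a))

    # BHJ and YIC coordinates in decimal degrees.
    km = haversine_distance(23.2878, 69.6700, 27.8025, 114.3061)
    print(f"{km / 1.609344:.1f} mi / {km:.1f} km / {km / 1.852:.1f} NM")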

How long does it take to fly from Bhuj to Yichun?

The estimated flight time from Bhuj Airport to Yichun Mingyueshan Airport is 5 hours and 46 minutes.
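The page does not state how this estimate is derived. A minimal sketch under a simple assumption is below: the 484 mph average speed is just the block speed implied by the figures above (2790 miles in 5 h 46 min), not a documented parameter of the calculator; many calculators instead assume roughly 500 mph cruise plus a fixed allowance for take-off and landing.

    def flight_time(distance_miles, avg_speed_mph=484):
        """Rough block time: distance divided by an assumed average speed."""
        minutes = round(distance_miles / avg_speed_mph * 60)
        return divmod(minutes, 60)

    h, m = flight_time(2790)
    print(f"{h} h {m} min")   # 5 h 46 min with this assumed speed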

Flight carbon footprint between Bhuj Airport (BHJ) and Yichun Mingyueshan Airport (YIC)

On average, flying from Bhuj to Yichun generates about 309 kg of CO2 per passenger, which is roughly 682 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
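The unit conversion and the implied per-mile emission rate can be checked directly from the figures quoted above; the sketch below uses only those numbers, since the calculator's actual methodology (aircraft type, load factor, fuel-burn model) isn't given.

    KG_PER_LB = 0.45359237

    co2_kg = 309                    # per-passenger estimate quoted above
    co2_lb = co2_kg / KG_PER_LB     # ~681 lb (the quoted 682 lb suggests an
                                    # unrounded figure slightly above 309 kg)
    kg_per_mile = co2_kg / 2790     # ~0.11 kg CO2 per passenger-mile implied

    print(f"{co2_lb:.0f} lb, {kg_per_mile:.3f} kg per passenger-mile")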

Map of flight path and driving directions from Bhuj to Yichun

See the map of the shortest flight path between Bhuj Airport (BHJ) and Yichun Mingyueshan Airport (YIC).

Airport information

Origin Bhuj Airport
City: Bhuj
Country: India
IATA Code: BHJ
ICAO Code: VABJ
Coordinates: 23°17′16″N, 69°40′12″E
Destination Yichun Mingyueshan Airport
City: Yichun
Country: China
IATA Code: YIC
ICAO Code: ZSYC
Coordinates: 27°48′9″N, 114°18′22″E