
How far is Yichun from Bima?

The distance between Bima (Sultan Muhammad Salahudin Airport) and Yichun (Yichun Mingyueshan Airport) is 2516 miles / 4048 kilometers / 2186 nautical miles.

Sultan Muhammad Salahudin Airport – Yichun Mingyueshan Airport: 2516 miles / 4048 kilometers / 2186 nautical miles

Distance from Bima to Yichun

There are several ways to calculate the distance from Bima to Yichun. Here are two standard methods:

Vincenty's formula (applied above)
  • 2515.618 miles
  • 4048.495 kilometers
  • 2186.012 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
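
To reproduce the figure, the sketch below is a minimal Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The ellipsoid parameters and convergence tolerance used by the calculator are not published, so the last decimal places may differ slightly.

    import math

    def vincenty_distance_m(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0            # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563    # WGS-84 flattening
        b = (1 - f) * a          # semi-minor axis (m)

        # Reduced latitudes and the difference in longitude
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0  # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1.0 - sin_alpha ** 2
            cos_2sigma_m = (cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
                            if cos2_alpha else 0.0)  # equatorial line
            C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
            lam_prev = lam
            lam = L + (1.0 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
        return b * A * (sigma - delta_sigma)

    # BMU and YIC coordinates from the airport information below, in decimal degrees
    bmu = (-(8 + 32 / 60 + 22 / 3600), 118 + 41 / 60 + 13 / 3600)  # 8°32′22″S, 118°41′13″E
    yic = (27 + 48 / 60 + 9 / 3600, 114 + 18 / 60 + 22 / 3600)     # 27°48′9″N, 114°18′22″E

    m = vincenty_distance_m(bmu[0], bmu[1], yic[0], yic[1])
    print(f"{m / 1609.344:.1f} mi, {m / 1000:.1f} km, {m / 1852:.1f} NM")
    # Expect roughly 2516 mi / 4048 km / 2186 NM, matching the Vincenty figures above.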

Haversine formula
  • 2528.026 miles
  • 4068.464 kilometers
  • 2196.795 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
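
For comparison, here is a minimal haversine sketch in Python, assuming a mean Earth radius of 6371 km (the radius used by the calculator is not stated, so the final digits may differ slightly).

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given mean radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    # Same BMU/YIC decimal-degree coordinates as in the Vincenty example
    km = haversine_km(-8.539444, 118.686944, 27.8025, 114.306111)
    print(f"{km:.1f} km  ({km / 1.609344:.1f} mi, {km / 1.852:.1f} NM)")
    # Should land close to the ≈ 4068 km haversine figure quoted above.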

How long does it take to fly from Bima to Yichun?

The estimated flight time from Sultan Muhammad Salahudin Airport to Yichun Mingyueshan Airport is 5 hours and 15 minutes.
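
The calculator's assumptions are not published; the sketch below only illustrates the generic approach of dividing the distance by an average block speed, with a hypothetical 480 mph chosen so the result lands near the figure above.

    def estimated_flight_time(distance_miles, block_speed_mph=480):
        """Rough block-time estimate; the 480 mph speed is an assumption for illustration."""
        total_min = round(distance_miles / block_speed_mph * 60)
        hours, minutes = divmod(total_min, 60)
        return f"{hours} h {minutes:02d} min"

    print(estimated_flight_time(2515.618))  # ≈ 5 h 14 min with this assumed speed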

What is the time difference between Bima and Yichun?

There is no time difference between Bima and Yichun.
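
Both airports sit at UTC+8 (Bima on Central Indonesia Time, Yichun on China Standard Time). This can be checked in Python with zoneinfo, using the standard IANA zone names for these regions:

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Bima (West Nusa Tenggara) is on Asia/Makassar (WITA);
    # Yichun (Jiangxi) is on Asia/Shanghai (China Standard Time).
    now = datetime.now(ZoneInfo("UTC"))
    for name in ("Asia/Makassar", "Asia/Shanghai"):
        print(name, now.astimezone(ZoneInfo(name)).utcoffset())  # both print 8:00:00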

Flight carbon footprint between Sultan Muhammad Salahudin Airport (BMU) and Yichun Mingyueshan Airport (YIC)

On average, flying from Bima to Yichun generates about 277 kg (611 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
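
As a quick sanity check, the unit conversion and the implied per-mile rate follow directly from the figures quoted above:

    CO2_KG = 277                 # per-passenger estimate quoted above
    DISTANCE_MI = 2515.618       # Vincenty distance quoted above
    KG_PER_LB = 0.45359237

    print(f"{CO2_KG / KG_PER_LB:.0f} lb")                 # ≈ 611 lb
    print(f"{CO2_KG / DISTANCE_MI * 1000:.0f} g per mi")  # ≈ 110 g of CO2 per passenger-mile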

Map of flight path from Bima to Yichun

See the map of the shortest flight path between Sultan Muhammad Salahudin Airport (BMU) and Yichun Mingyueshan Airport (YIC).

Airport information

Origin: Sultan Muhammad Salahudin Airport
City: Bima
Country: Indonesia
IATA Code: BMU
ICAO Code: WADB
Coordinates: 8°32′22″S, 118°41′13″E

Destination: Yichun Mingyueshan Airport
City: Yichun
Country: China
IATA Code: YIC
ICAO Code: ZSYC
Coordinates: 27°48′9″N, 114°18′22″E
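
The coordinates above are given in degrees, minutes and seconds; a small helper like the one below (a sketch assuming exactly the notation shown here) converts them to the decimal degrees used in the distance examples:

    import re

    def dms_to_decimal(dms: str) -> float:
        """Convert e.g. "8°32′22″S" or "114°18′22″E" to signed decimal degrees."""
        deg, minutes, seconds, hemi = re.match(
            r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
        value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
        return -value if hemi in "SW" else value

    print(dms_to_decimal("8°32′22″S"))    # ≈ -8.5394
    print(dms_to_decimal("118°41′13″E"))  # ≈ 118.6869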