
How far is Ulanhot from Miyakejima?

The distance between Miyakejima (Miyakejima Airport) and Ulanhot (Ulanhot Yilelite Airport) is 1240 miles / 1996 kilometers / 1078 nautical miles.

The driving distance from Miyakejima (MYE) to Ulanhot (HLH) is 1992 miles / 3206 kilometers, and travel time by car is about 87 hours 8 minutes.

Miyakejima Airport – Ulanhot Yilelite Airport
1240 miles / 1996 kilometers / 1078 nautical miles

Distance from Miyakejima to Ulanhot

There are several ways to calculate the distance from Miyakejima to Ulanhot. Here are two standard methods:

Vincenty's formula (applied above)
  • 1240.294 miles
  • 1996.060 kilometers
  • 1077.786 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
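
To make this concrete, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is a standard textbook implementation rather than the calculator's own code, and the decimal coordinates are converted from the airport coordinates listed under "Airport information" below.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Inverse Vincenty formula: ellipsoidal distance in metres (WGS-84)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = 0.0 if cos2_alpha == 0 else cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# MYE (34°4′24″N, 139°33′35″E) to HLH (46°4′58″N, 122°1′1″E), in decimal degrees
metres = vincenty_distance(34.0733, 139.5597, 46.0828, 122.0169)
print(metres / 1609.344, metres / 1000, metres / 1852)  # ≈ 1240 mi / 1996 km / 1078 NM
```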

Haversine formula
  • 1239.371 miles
  • 1994.574 kilometers
  • 1076.984 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
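
A matching haversine sketch in Python, assuming a mean Earth radius of 6371 km (the radius is an assumption; other common values shift the result by a few kilometres):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h))

# Same decimal coordinates as above; prints roughly 1994.6 km,
# in line with the haversine figure quoted here.
print(haversine_distance(34.0733, 139.5597, 46.0828, 122.0169))
```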

How long does it take to fly from Miyakejima to Ulanhot?

The estimated flight time from Miyakejima Airport to Ulanhot Yilelite Airport is 2 hours and 50 minutes.
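
The page does not state how this figure is derived, so the check below is only a rough sketch: it assumes flight time is distance divided by an effective block speed (cruise averaged with taxi, climb and descent), and a speed of about 438 mph happens to reproduce the quoted 2 hours 50 minutes.

```python
# Assumption: flight time ≈ great-circle distance / effective block speed.
# The 438 mph value is back-solved from the quoted time, not the site's formula.
distance_miles = 1240.294          # Vincenty distance from above
block_speed_mph = 438
hours = distance_miles / block_speed_mph
print(f"{int(hours)} h {round(hours % 1 * 60)} min")   # 2 h 50 min
```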

Flight carbon footprint between Miyakejima Airport (MYE) and Ulanhot Yilelite Airport (HLH)

On average, flying from Miyakejima to Ulanhot generates about 163 kg (359 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
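
The emissions methodology is not given on the page, so the snippet below only converts the stated 163 kg to pounds and derives the implied intensity per passenger-kilometre; it does not reproduce the estimate itself.

```python
co2_kg = 163                    # stated estimate per passenger
distance_km = 1996.060          # Vincenty distance from above
print(round(co2_kg * 2.20462))             # ≈ 359 lb, as quoted
print(round(co2_kg / distance_km * 1000))  # ≈ 82 g CO2 per passenger-km
```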

Map of flight path and driving directions from Miyakejima to Ulanhot

See the map of the shortest flight path between Miyakejima Airport (MYE) and Ulanhot Yilelite Airport (HLH).

Airport information

Origin: Miyakejima Airport
City: Miyakejima
Country: Japan
IATA Code: MYE
ICAO Code: RJTQ
Coordinates: 34°4′24″N, 139°33′35″E
Destination: Ulanhot Yilelite Airport
City: Ulanhot
Country: China
IATA Code: HLH
ICAO Code: ZBUL
Coordinates: 46°4′58″N, 122°1′1″E
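
The distance sketches above use decimal degrees; a small helper like the hypothetical dms_to_decimal below converts the coordinates listed here.

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere (N/S/E/W) to decimal degrees."""
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Miyakejima Airport (MYE): 34°4′24″N, 139°33′35″E
print(dms_to_decimal(34, 4, 24, "N"), dms_to_decimal(139, 33, 35, "E"))  # ≈ 34.0733 139.5597
# Ulanhot Yilelite Airport (HLH): 46°4′58″N, 122°1′1″E
print(dms_to_decimal(46, 4, 58, "N"), dms_to_decimal(122, 1, 1, "E"))    # ≈ 46.0828 122.0169
```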