
How far is Jixi from Simikot?

The distance between Simikot (Simikot Airport) and Jixi (Jixi Xingkaihu Airport) is 2855 miles / 4595 kilometers / 2481 nautical miles.

The driving distance from Simikot (IMK) to Jixi (JXA) is 3902 miles / 6279 kilometers, and travel time by car is about 75 hours 4 minutes.
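Dividing distance by travel time gives the average speed these driving figures imply:

```python
hours = 75 + 4 / 60            # 75 hours 4 minutes
print(round(6279 / hours))     # ≈ 84 km/h average
print(round(3902 / hours))     # ≈ 52 mph average
```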

Simikot Airport – Jixi Xingkaihu Airport

Distance: 2855 miles / 4595 kilometers / 2481 nautical miles
Flight time: 5 h 54 min
Time difference: 2 h 15 min
CO2 emission: 317 kg


Distance from Simikot to Jixi

There are several ways to calculate the distance from Simikot to Jixi. Here are two standard methods:

Vincenty's formula (applied above)
  • 2855.153 miles
  • 4594.923 kilometers
  • 2481.060 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
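The calculator's own implementation isn't published. As an illustration, here is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid; the decimal coordinates are converted from the DMS values listed under airport information below.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's iterative inverse formula."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # usually converges in a few iterations
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0          # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        if cos2Alpha != 0:
            cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        else:
            cos2SigmaM = 0.0    # both points on the equator
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)   # meters

m = vincenty_inverse(29.9708, 81.8189, 45.2928, 131.1928)
print(round(m / 1000))       # ≈ 4595 km
print(round(m / 1609.344))   # ≈ 2855 miles
```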

Haversine formula
  • 2850.079 miles
  • 4586.758 kilometers
  • 2476.651 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
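The haversine formula is compact enough to show in full. A minimal Python sketch, using the same decimal coordinates and a mean earth radius of 6371 km:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(29.9708, 81.8189, 45.2928, 131.1928)
print(round(km))             # ≈ 4587 km
print(round(km / 1.609344))  # ≈ 2850 miles
```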

How long does it take to fly from Simikot to Jixi?

The estimated flight time from Simikot Airport to Jixi Xingkaihu Airport is 5 hours and 54 minutes.
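The calculator doesn't publish its timing model. A common rule of thumb, sketched below with assumed values (a 500 mph average cruise speed plus 30 minutes of taxi, climb, and descent), lands in the same ballpark as the quoted figure:

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: distance at an average cruise speed
    plus a fixed allowance for taxi, climb, and descent. Both defaults
    are illustrative assumptions, not the calculator's parameters."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(2855))  # "6 h 13 min" with these assumptions
```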

Flight carbon footprint between Simikot Airport (IMK) and Jixi Xingkaihu Airport (JXA)

On average, flying from Simikot to Jixi generates about 317 kg of CO2 per passenger; 317 kilograms is equal to 699 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
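As an arithmetic check, the emission factor implied by these figures is roughly 317 kg / 4595 km ≈ 0.069 kg of CO2 per passenger-kilometre. The sketch below uses that back-calculated value, an assumption rather than a published factor:

```python
def co2_kg(distance_km, kg_per_km=0.069):
    """Per-passenger CO2 estimate from burning jet fuel only.
    kg_per_km is back-calculated from the figures above
    (317 kg / 4595 km), not an official emission factor."""
    return distance_km * kg_per_km

kg = co2_kg(4595)
print(round(kg))             # ≈ 317 kg
print(round(kg * 2.20462))   # ≈ 699 lbs
```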

Map of flight path and driving directions from Simikot to Jixi

See the map of the shortest flight path between Simikot Airport (IMK) and Jixi Xingkaihu Airport (JXA).

Airport information

Origin: Simikot Airport
City: Simikot
Country: Nepal
IATA Code: IMK
ICAO Code: VNST
Coordinates: 29°58′15″N, 81°49′8″E
Destination: Jixi Xingkaihu Airport
City: Jixi
Country: China
IATA Code: JXA
ICAO Code: ZYJX
Coordinates: 45°17′34″N, 131°11′34″E
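The distance formulas above take decimal degrees, while the coordinates here are given in degrees, minutes, and seconds. A small conversion sketch:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Simikot Airport (IMK): 29°58′15″N, 81°49′8″E
print(dms_to_decimal(29, 58, 15, "N"))   # ≈ 29.9708
print(dms_to_decimal(81, 49, 8, "E"))    # ≈ 81.8189

# Jixi Xingkaihu Airport (JXA): 45°17′34″N, 131°11′34″E
print(dms_to_decimal(45, 17, 34, "N"))   # ≈ 45.2928
print(dms_to_decimal(131, 11, 34, "E"))  # ≈ 131.1928
```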