
How far is Semey from Chatham Island?

The distance between Chatham Island (Chatham Islands / Tuuta Airport) and Semey (Semey Airport) is 8953 miles / 14408 kilometers / 7780 nautical miles.

Chatham Islands / Tuuta Airport – Semey Airport

Distance: 8953 miles / 14408 kilometers / 7780 nautical miles
Flight time: 17 h 27 min
Time difference: 8 h 45 min
CO2 emission: 1 140 kg


Distance from Chatham Island to Semey

There are several ways to calculate the distance from Chatham Island to Semey. Here are two standard methods:

Vincenty's formula (applied above)
  • 8952.987 miles
  • 14408.437 kilometers
  • 7779.933 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
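The site doesn't publish its code; below is a minimal, self-contained Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name and structure are illustrative, not the site's implementation:

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in statute miles via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0              # semi-major axis, metres
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis, metres

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):  # may fail to converge for near-antipodal points
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on equatorial geodesics (cos2_alpha == 0)
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1609.344   # metres -> statute miles

# CHT and PLX in decimal degrees (converted from the DMS values in the airport table)
print(round(vincenty_miles(-43.8100, -176.456944, 50.351111, 80.234167), 3))
# -> approximately 8952.99 miles, closely matching the figure above
```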

Haversine formula
  • 8962.734 miles
  • 14424.122 kilometers
  • 7788.403 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance – the shortest path between two points along the earth's surface.
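For comparison, here is a minimal haversine sketch. The 6371 km mean Earth radius is a common convention; the exact radius constant the site uses is not stated:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles on a sphere of mean Earth radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(h)) / 1.609344  # km -> miles

print(round(haversine_miles(-43.8100, -176.456944, 50.351111, 80.234167), 3))
# -> approximately 8962.7 miles, close to the haversine figure above
```

The ~10-mile gap between the two results reflects the difference between the spherical and ellipsoidal earth models.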

How long does it take to fly from Chatham Island to Semey?

The estimated flight time from Chatham Islands / Tuuta Airport to Semey Airport is 17 hours and 27 minutes.
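The page doesn't state its timing model. As an illustrative back-calculation, dividing the quoted distance by an assumed average block speed of about 513 mph reproduces the quoted time; that speed is derived from the figures here, not a published parameter:

```python
distance_miles = 8953
assumed_speed_mph = 513        # assumption, back-calculated from the quoted time

hours = distance_miles / assumed_speed_mph   # ~17.45 h
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")        # -> 17 h 27 min
```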

Flight carbon footprint between Chatham Islands / Tuuta Airport (CHT) and Semey Airport (PLX)

On average, flying from Chatham Island to Semey generates about 1 140 kg (2 514 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
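A quick arithmetic check of the unit conversion, using the standard kilogram-to-pound factor (the page appears to round up):

```python
import math

co2_kg = 1140
co2_lbs = co2_kg * 2.20462     # 1 kg = 2.20462 lbs, so ~2513.3 lbs
print(math.ceil(co2_lbs))      # -> 2514, matching the quoted figure
```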

Map of flight path from Chatham Island to Semey

See the map of the shortest flight path between Chatham Islands / Tuuta Airport (CHT) and Semey Airport (PLX).

Airport information

Origin: Chatham Islands / Tuuta Airport
City: Chatham Island
Country: New Zealand
IATA Code: CHT
ICAO Code: NZCI
Coordinates: 43°48′36″S, 176°27′25″W

Destination: Semey Airport
City: Semey
Country: Kazakhstan
IATA Code: PLX
ICAO Code: UASS
Coordinates: 50°21′4″N, 80°14′3″E
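The coordinates above are given in degrees, minutes, and seconds, while the distance sketches earlier use decimal degrees. A small illustrative helper (the function name is hypothetical) performs the conversion:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# Airport coordinates from the table above
cht_lat = dms_to_decimal(43, 48, 36, "S")    # -43.8100
cht_lon = dms_to_decimal(176, 27, 25, "W")   # -176.456944...
plx_lat = dms_to_decimal(50, 21, 4, "N")     #  50.351111...
plx_lon = dms_to_decimal(80, 14, 3, "E")     #  80.234167...
```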