How far is Barnaul from Chatham Island?

The distance between Chatham Island (Chatham Islands / Tuuta Airport) and Barnaul (Barnaul Airport) is 8907 miles / 14335 kilometers / 7740 nautical miles.

Chatham Islands / Tuuta Airport – Barnaul Airport

Distance: 8907 miles / 14335 kilometers / 7740 nautical miles
Flight time: 17 h 21 min
Time difference: 7 h 45 min
CO2 emission: 1 133 kg

Distance from Chatham Island to Barnaul

There are several ways to calculate the distance from Chatham Island to Barnaul. Here are two standard methods:

Vincenty's formula (applied above)
  • 8907.283 miles
  • 14334.882 kilometers
  • 7740.217 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
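
To illustrate, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is not the calculator's own code; the ellipsoid constants, the convergence tolerance, and the decimal-degree conversion of the airport coordinates listed under Airport information below are assumptions, so the printed result should only land close to the figures above.

    import math

    def vincenty_distance(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Inverse Vincenty formula on the WGS-84 ellipsoid; returns metres."""
        a = 6378137.0              # semi-major axis (m)
        f = 1 / 298.257223563      # flattening
        b = (1 - f) * a            # semi-minor axis (m)

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        lam = L

        sin_U1, cos_U1 = math.sin(U1), math.cos(U1)
        sin_U2, cos_U2 = math.sin(U2), math.cos(U2)

        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cos_U2 * sin_lam,
                                   cos_U1 * sin_U2 - sin_U1 * cos_U2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sin_U1 * sin_U2 + cos_U1 * cos_U2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cos_U1 * cos_U2 * sin_lam / sin_sigma
            cos_sq_alpha = 1 - sin_alpha ** 2
            # cos(2 * sigma_m); zero for equatorial geodesics
            cos_2sm = (cos_sigma - 2 * sin_U1 * sin_U2 / cos_sq_alpha
                       if cos_sq_alpha else 0.0)
            C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
        B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
        delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - delta_sigma)

    # Airport coordinates converted to decimal degrees (south and west negative)
    cht = (-(43 + 48/60 + 36/3600), -(176 + 27/60 + 25/3600))  # CHT: 43°48′36″S, 176°27′25″W
    bax = (53 + 21/60 + 49/3600, 83 + 32/60 + 18/3600)         # BAX: 53°21′49″N, 83°32′18″E

    metres = vincenty_distance(cht[0], cht[1], bax[0], bax[1])
    print(f"{metres / 1609.344:.0f} mi, {metres / 1000:.0f} km, {metres / 1852:.0f} NM")
    # roughly 8907 mi, 14335 km, 7740 NM (cf. the figures above)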

Haversine formula
  • 8918.123 miles
  • 14352.327 kilometers
  • 7749.637 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest path between the two points along the surface).
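
A matching haversine sketch follows. The mean Earth radius of 6 371 km is an assumed value; the calculator may use a slightly different radius, which shifts the result by a few kilometres relative to the figure above.

    import math

    def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance on a sphere of the given radius; returns kilometres."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        d_phi = math.radians(lat2 - lat1)
        d_lam = math.radians(lon2 - lon1)
        a = (math.sin(d_phi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
        return radius_km * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

    # CHT and BAX coordinates in decimal degrees (south and west negative)
    km = haversine_distance(-43.810000, -176.456944, 53.363611, 83.538333)
    print(f"{km:.0f} km ≈ {km / 1.609344:.0f} mi")
    # roughly 14350 km ≈ 8918 mi with this radius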

How long does it take to fly from Chatham Island to Barnaul?

The estimated flight time from Chatham Islands / Tuuta Airport to Barnaul Airport is 17 hours and 21 minutes.

Flight carbon footprint between Chatham Islands / Tuuta Airport (CHT) and Barnaul Airport (BAX)

On average, flying from Chatham Island to Barnaul generates about 1 133 kg of CO2 per passenger; 1 133 kilograms is equal to 2 498 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
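
The pounds figure is just a unit conversion, as this one-line check shows (2.20462 lbs per kilogram is the standard factor):

    co2_kg = 1133
    print(round(co2_kg * 2.20462))  # 1 kg ≈ 2.20462 lbs, so about 2498 lbs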

Map of flight path from Chatham Island to Barnaul

See the map of the shortest flight path between Chatham Islands / Tuuta Airport (CHT) and Barnaul Airport (BAX).

Airport information

Origin: Chatham Islands / Tuuta Airport
City: Chatham Island
Country: New Zealand
IATA Code: CHT
ICAO Code: NZCI
Coordinates: 43°48′36″S, 176°27′25″W
Destination: Barnaul Airport
City: Barnaul
Country: Russia
IATA Code: BAX
ICAO Code: UNBB
Coordinates: 53°21′49″N, 83°32′18″E