
How far is Bathurst from Columbus, OH?

The distance between Columbus (John Glenn Columbus International Airport) and Bathurst (Bathurst Airport) is 9532 miles / 15340 kilometers / 8283 nautical miles.

John Glenn Columbus International Airport – Bathurst Airport

  • Distance: 9532 miles / 15340 kilometers / 8283 nautical miles
  • Flight time: 18 h 32 min
  • CO2 emission: 1 229 kg


Distance from Columbus to Bathurst

There are several ways to calculate the distance from Columbus to Bathurst. Here are two standard methods:

Vincenty's formula (applied above)
  • 9531.629 miles
  • 15339.669 kilometers
  • 8282.759 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
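As a sketch of how such a figure can be computed, here is Vincenty's inverse formula on the WGS-84 ellipsoid (whether the site uses exactly these ellipsoid parameters is an assumption), with the airport coordinates from the table below converted from degrees/minutes/seconds to decimal degrees:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2, a=6378137.0, f=1 / 298.257223563):
    """Geodesic distance in km via Vincenty's inverse formula (WGS-84 by default)."""
    b = a * (1 - f)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - d_sigma) / 1000.0

# CMH 39°59′52″N, 82°53′30″W and BHS 33°24′33″S, 149°39′7″E as decimal degrees
cmh = (39 + 59 / 60 + 52 / 3600, -(82 + 53 / 60 + 30 / 3600))
bhs = (-(33 + 24 / 60 + 33 / 3600), 149 + 39 / 60 + 7 / 3600)
print(round(vincenty_km(*cmh, *bhs), 3))  # ≈ 15339.7 km, matching the figure above
```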

Haversine formula
  • 9532.956 miles
  • 15341.806 kilometers
  • 8283.913 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
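The haversine calculation is much simpler. A minimal sketch, assuming the conventional mean Earth radius of 6 371 km (the exact radius the site uses is an assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Same airport coordinates as above, in decimal degrees
cmh = (39 + 59 / 60 + 52 / 3600, -(82 + 53 / 60 + 30 / 3600))
bhs = (-(33 + 24 / 60 + 33 / 3600), 149 + 39 / 60 + 7 / 3600)
print(round(haversine_km(*cmh, *bhs), 1))  # ≈ 15341.8 km, matching the figure above
```

Note that the spherical result is about 2 km longer than the ellipsoidal one here, which is why the two methods quote slightly different figures.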

How long does it take to fly from Columbus to Bathurst?

The estimated flight time from John Glenn Columbus International Airport to Bathurst Airport is 18 hours and 32 minutes.
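The page does not state its speed model. As a rough sketch, a duration like this can be derived from the distance and an assumed average block speed; the 828 km/h figure below is purely illustrative (chosen to roughly reproduce the quoted duration) and is not taken from the site:

```python
def flight_time(distance_km, avg_speed_kmh):
    """Estimate flight time and format it as 'H h M min'."""
    hours = distance_km / avg_speed_kmh
    h, m = divmod(round(hours * 60), 60)
    return f"{h} h {m} min"

# Assumed average speed of 828 km/h over the 15340 km route
print(flight_time(15340, 828))  # → 18 h 32 min
```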

Flight carbon footprint between John Glenn Columbus International Airport (CMH) and Bathurst Airport (BHS)

On average, flying from Columbus to Bathurst generates about 1 229 kg of CO2 per passenger; 1 229 kilograms is equal to 2 709 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
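The kilogram-to-pound conversion in the paragraph above can be checked directly (0.45359237 kg per pound is the exact definition of the pound):

```python
KG_PER_LB = 0.45359237  # exact, by definition

def kg_to_lbs(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lbs(1229)))  # → 2709
```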

Map of flight path from Columbus to Bathurst

See the map of the shortest flight path between John Glenn Columbus International Airport (CMH) and Bathurst Airport (BHS).

Airport information

Origin: John Glenn Columbus International Airport
City: Columbus, OH
Country: United States
IATA Code: CMH
ICAO Code: KCMH
Coordinates: 39°59′52″N, 82°53′30″W
Destination: Bathurst Airport
City: Bathurst
Country: Australia
IATA Code: BHS
ICAO Code: YBTH
Coordinates: 33°24′33″S, 149°39′7″E