
How far is Stephenville from Brisbane?

The distance between Brisbane (Brisbane Airport) and Stephenville (Stephenville International Airport) is 10202 miles / 16419 kilometers / 8865 nautical miles.

Brisbane Airport – Stephenville International Airport

Distance: 10202 miles / 16419 kilometers / 8865 nautical miles
Flight time: 19 h 48 min
Time difference: 13 h 30 min
CO2 emission: 1 333 kg


Distance from Brisbane to Stephenville

There are several ways to calculate the distance from Brisbane to Stephenville. Here are two standard methods:

Vincenty's formula (applied above)
  • 10202.219 miles
  • 16418.881 kilometers
  • 8865.486 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
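As an illustration, here is a minimal pure-Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The airport coordinates are the decimal form of the DMS values listed on this page; the function name and structure are this sketch's own, not the site's implementation.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2):
    """Distance in km between two points on the WGS-84 ellipsoid
    (Vincenty's inverse formula, iterative)."""
    a = 6378137.0             # WGS-84 semi-major axis in metres
    f = 1 / 298.257223563     # WGS-84 flattening
    b = a * (1 - f)           # semi-minor axis

    # Normalize the longitude difference to (-180, 180] degrees.
    L = math.radians((lon2 - lon1 + 540) % 360 - 180)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda to convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sm ** 2)
        - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                          * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1000  # metres -> km

# BNE (27°23′3″S, 153°7′1″E) to YJT (48°32′39″N, 58°32′59″W), decimal degrees
km = vincenty_inverse(-27.384167, 153.116944, 48.544167, -58.549722)
```

Run against the coordinates above, this reproduces the ellipsoidal distance of roughly 16419 km quoted on this page.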

Haversine formula
  • 10203.547 miles
  • 16421.017 kilometers
  • 8866.640 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
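The haversine formula is short enough to sketch directly. This version assumes a mean Earth radius of 6 371 km (a common choice for great-circle calculations); the coordinates are the decimal form of the DMS values listed below.

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of radius r
    (mean Earth radius by default)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points.
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))

# BNE to YJT in decimal degrees
km = haversine(-27.384167, 153.116944, 48.544167, -58.549722)
```

The result agrees with the spherical distance of roughly 16421 km quoted above; the small gap from the Vincenty figure reflects the spherical versus ellipsoidal Earth models.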

How long does it take to fly from Brisbane to Stephenville?

The estimated flight time from Brisbane Airport to Stephenville International Airport is 19 hours and 48 minutes.
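The page does not state its timing model, but the figure is consistent with a simple rule of thumb: about 30 minutes of taxi and climb overhead plus the great-circle distance flown at an average cruise speed of roughly 850 km/h. A hypothetical sketch (not the site's documented method):

```python
# Hypothetical model: 30 min overhead plus cruise at ~850 km/h.
distance_km = 16419
hours = 0.5 + distance_km / 850
h = int(hours)
m = round((hours - h) * 60)
print(f"{h} h {m} min")  # within a minute or so of the 19 h 48 min shown above
```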

Flight carbon footprint between Brisbane Airport (BNE) and Stephenville International Airport (YJT)

On average, flying from Brisbane to Stephenville generates about 1 333 kg (2 940 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
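The kilogram-to-pound conversion can be checked directly, using the standard factor of 2.20462 lbs per kg:

```python
kg = 1333
lbs = kg * 2.20462  # international avoirdupois pounds per kilogram
print(round(lbs))   # 2939, which the page rounds to about 2 940 lbs
```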

Map of flight path from Brisbane to Stephenville

See the map of the shortest flight path between Brisbane Airport (BNE) and Stephenville International Airport (YJT).

Airport information

Origin: Brisbane Airport
City: Brisbane
Country: Australia
IATA Code: BNE
ICAO Code: YBBN
Coordinates: 27°23′3″S, 153°7′1″E
Destination: Stephenville International Airport
City: Stephenville
Country: Canada
IATA Code: YJT
ICAO Code: CYJT
Coordinates: 48°32′39″N, 58°32′59″W
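The coordinates above are given in degrees, minutes, and seconds; a small helper (hypothetical, not part of the site) converts them to the signed decimal degrees that distance formulas expect:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter
    (N/S/E/W) to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# BNE: 27°23′3″S, 153°7′1″E   YJT: 48°32′39″N, 58°32′59″W
bne = (dms_to_decimal(27, 23, 3, "S"), dms_to_decimal(153, 7, 1, "E"))
yjt = (dms_to_decimal(48, 32, 39, "N"), dms_to_decimal(58, 32, 59, "W"))
```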