
How far is St. John's from Tête-à-la-Baleine?

The distance between Tête-à-la-Baleine (Tête-à-la-Baleine Airport) and St. John's (St. John's International Airport) is 367 miles / 591 kilometers / 319 nautical miles.

The driving distance from Tête-à-la-Baleine (ZTB) to St. John's (YYT) is 761 miles / 1224 kilometers, and travel time by car is about 61 hours 6 minutes.

Tête-à-la-Baleine Airport – St. John's International Airport

367 miles / 591 kilometers / 319 nautical miles


Distance from Tête-à-la-Baleine to St. John's

There are several ways to calculate the distance from Tête-à-la-Baleine to St. John's. Here are two standard methods:

Vincenty's formula (applied above)
  • 367.164 miles
  • 590.894 kilometers
  • 319.057 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
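
For reference, below is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. This is a standard textbook implementation, not the calculator's own code, and small differences from the figures above come down to coordinate rounding:

```python
import math

def vincenty_km(lat1, lon1, lat2, lon2):
    """Inverse Vincenty distance on the WGS-84 ellipsoid, in kilometers."""
    a, f = 6378137.0, 1 / 298.257223563   # WGS-84 semi-major axis (m) and flattening
    b = (1 - f) * a

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):                   # iterate until lambda converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                     # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
            cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a * a - b * b) / (b * b)
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1000.0

# ZTB -> YYT, in decimal degrees (converted from the DMS coordinates listed below)
print(round(vincenty_km(50.674167, -59.383333, 47.618333, -52.751667), 1))
# -> approx. 590.9 km
```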

Haversine formula
  • 366.401 miles
  • 589.665 kilometers
  • 318.394 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance, the shortest path between two points along the surface of a sphere).
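
The haversine formula is compact enough to show in full. Below is a minimal Python sketch using the mean Earth radius (6371.0088 km) and the airport coordinates from the airport information section; the small gap to the 589.665 km above reflects rounding of the coordinates and radius:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0088):
    """Great-circle distance on a sphere of radius r (km, mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

km = haversine_km(50.674167, -59.383333, 47.618333, -52.751667)  # ZTB -> YYT
print(f"{km / 1.609344:.1f} miles")         # -> approx. 366.4
print(f"{km:.1f} kilometers")               # -> approx. 589.7
print(f"{km / 1.852:.1f} nautical miles")   # -> approx. 318.4
```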

How long does it take to fly from Tête-à-la-Baleine to St. John's?

The estimated flight time from Tête-à-la-Baleine Airport to St. John's International Airport is 1 hour and 11 minutes.
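
The page doesn't state how the estimate is derived. A common rule of thumb, used here purely as an illustrative assumption, adds roughly 30 minutes of taxi, climb, and descent overhead to the cruise time at about 500 mph, which lands within a few minutes of the figure above:

```python
# Rough block-time estimate: cruise at ~500 mph plus ~30 min of overhead.
# Both constants are assumptions, not the calculator's published method.
distance_miles = 367.164
cruise_mph = 500
overhead_min = 30

total_min = overhead_min + distance_miles / cruise_mph * 60
print(f"{int(total_min // 60)} h {round(total_min % 60)} min")  # -> 1 h 14 min
```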

Flight carbon footprint between Tête-à-la-Baleine Airport (ZTB) and St. John's International Airport (YYT)

On average, flying from Tête-à-la-Baleine to St. John's generates about 79 kg of CO2 per passenger, which is equivalent to 174 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
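
The unit conversion is easy to check, and an implied emission factor can be backed out of the figures above. The calculator's underlying emission model isn't published, so the per-mile rate below is simply what the stated numbers imply:

```python
co2_kg = 79
distance_miles = 367.164

print(round(co2_kg * 2.20462))            # kg -> lbs: 174
print(round(co2_kg / distance_miles, 3))  # implied factor: 0.215 kg CO2 per passenger-mile
```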

Map of flight path and driving directions from Tête-à-la-Baleine to St. John's

See the map of the shortest flight path between Tête-à-la-Baleine Airport (ZTB) and St. John's International Airport (YYT).

Airport information

Origin: Tête-à-la-Baleine Airport
City: Tête-à-la-Baleine
Country: Canada
IATA Code: ZTB
ICAO Code: CTB6
Coordinates: 50°40′27″N, 59°23′0″W
Destination: St. John's International Airport
City: St. John's
Country: Canada
IATA Code: YYT
ICAO Code: CYYT
Coordinates: 47°37′6″N, 52°45′6″W
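
The coordinates are listed in degrees, minutes, and seconds; a small helper (hypothetical, shown for illustration) converts them to the decimal degrees used in the distance sketches above:

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

# ZTB: 50°40′27″N, 59°23′0″W
print(dms_to_decimal(50, 40, 27, "N"), dms_to_decimal(59, 23, 0, "W"))
# -> 50.674166..., -59.383333...

# YYT: 47°37′6″N, 52°45′6″W
print(dms_to_decimal(47, 37, 6, "N"), dms_to_decimal(52, 45, 6, "W"))
# -> 47.618333..., -52.751666...
```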