
How far is Pontes e Lacerda from Toronto?

The distance between Toronto (Toronto Pearson International Airport) and Pontes e Lacerda (Pontes e Lacerda Airport) is 4248 miles / 6837 kilometers / 3692 nautical miles.

Toronto Pearson International Airport – Pontes e Lacerda Airport

  • 4248 miles
  • 6837 kilometers
  • 3692 nautical miles


Distance from Toronto to Pontes e Lacerda

There are several ways to calculate the distance from Toronto to Pontes e Lacerda. Here are two standard methods:

Vincenty's formula (applied above)
  • 4248.260 miles
  • 6836.912 kilometers
  • 3691.637 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
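A minimal sketch of Vincenty's inverse method on the WGS-84 ellipsoid (semi-major axis 6,378,137 m, flattening 1/298.257223563) is shown below. The coordinate values are the airports' DMS coordinates from this page converted to decimal degrees; the implementation follows the standard published iteration and is illustrative rather than the exact code this calculator uses.

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres between two points (decimal degrees),
    using Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0                      # semi-major axis (m)
    f = 1 / 298.257223563              # flattening
    b = (1 - f) * a                    # semi-minor axis (m)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):               # iterate until the longitude converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                 # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)   # equatorial geodesic
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
            * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)  # metres

# YYZ (43°40′37″N, 79°37′50″W) → LCB (15°11′36″S, 59°23′5″W)
# vincenty(43.67694, -79.63056, -15.19333, -59.38472) ≈ 6,836.9 km,
# matching the figure quoted above.
```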

Haversine formula
  • 4264.134 miles
  • 6862.458 kilometers
  • 3705.431 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
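The great-circle calculation above can be sketched as follows. A mean Earth radius of 6,371 km is assumed; calculators differ slightly in the radius they use, so results can vary by a few kilometres.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometres between two points
    (decimal degrees) on a sphere of the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# YYZ → LCB comes out near the 6,862 km quoted above; the small gap
# versus Vincenty reflects the spherical vs. ellipsoidal Earth model.
```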

How long does it take to fly from Toronto to Pontes e Lacerda?

The estimated flight time from Toronto Pearson International Airport to Pontes e Lacerda Airport is 8 hours and 32 minutes.
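The page does not state how it derives flight time. One common rough model adds a fixed taxi/climb/descent overhead to cruise time at an average ground speed; the 500 mph speed and 30-minute overhead below are illustrative assumptions, not this site's actual parameters.

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed overhead for taxi, climb and
    descent, plus cruise time at an assumed average speed (both
    parameters are illustrative assumptions)."""
    return overhead_min + distance_miles / cruise_mph * 60

# flight_time_minutes(4248) → 539.76 minutes, i.e. about 9 hours under
# these assumed parameters; actual times depend on winds and routing.
```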

Flight carbon footprint between Toronto Pearson International Airport (YYZ) and Pontes e Lacerda Airport (LCB)

On average, flying from Toronto to Pontes e Lacerda generates about 487 kg (about 1,074 pounds) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
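A per-passenger footprint like this is typically distance times an emissions factor. The factor below (0.0712 kg CO2 per passenger-kilometre) is simply back-derived from the figures on this page, so it is an illustrative assumption; real factors vary with aircraft type, load factor and routing.

```python
def co2_kg(distance_km, kg_per_pax_km=0.0712):
    """Per-passenger CO2 estimate; the emissions factor here is an
    illustrative assumption back-derived from this page's figures."""
    return distance_km * kg_per_pax_km

def kg_to_lb(kg):
    """Convert kilograms to pounds (1 kg = 2.20462262 lb)."""
    return kg * 2.20462262

# co2_kg(6837) ≈ 487 kg, and kg_to_lb(487) ≈ 1,074 lb.
```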

Map of flight path from Toronto to Pontes e Lacerda

See the map of the shortest flight path between Toronto Pearson International Airport (YYZ) and Pontes e Lacerda Airport (LCB).

Airport information

Origin: Toronto Pearson International Airport
City: Toronto
Country: Canada
IATA Code: YYZ
ICAO Code: CYYZ
Coordinates: 43°40′37″N, 79°37′50″W
Destination: Pontes e Lacerda Airport
City: Pontes e Lacerda
Country: Brazil
IATA Code: LCB
ICAO Code: SWBG
Coordinates: 15°11′36″S, 59°23′5″W
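The degrees-minutes-seconds coordinates listed above must be converted to decimal degrees before they can be fed to a distance formula. A minimal conversion:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter
    (N/S/E/W) to signed decimal degrees; S and W are negative."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# 43°40′37″N → about 43.67694; 15°11′36″S → about -15.19333
```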