How far is Philadelphia, PA, from Tofino?

The distance between Tofino (Tofino/Long Beach Airport) and Philadelphia (Wings Field) is 2517 miles / 4051 kilometers / 2187 nautical miles.

The driving distance from Tofino (YAZ) to Philadelphia (BBX) is 3089 miles / 4972 kilometers, and travel time by car is about 57 hours 41 minutes.

Distance from Tofino to Philadelphia

There are several ways to calculate the distance from Tofino to Philadelphia. Here are two standard methods:

Vincenty's formula (applied above)
  • 2517.117 miles
  • 4050.908 kilometers
  • 2187.315 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
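
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The airport coordinates (taken from the table below) are converted to decimal degrees; the function name, convergence tolerance, and iteration cap are arbitrary choices, and nearly antipodal point pairs, where the iteration can fail to converge, are not handled.

    import math

    def vincenty_distance_km(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
        a = 6378137.0             # WGS-84 semi-major axis (metres)
        f = 1 / 298.257223563     # WGS-84 flattening
        b = (1 - f) * a           # semi-minor axis

        L = math.radians(lon2 - lon1)
        # Reduced latitudes on the auxiliary sphere
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0        # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 *
                  (cos_sigma * (-1 + 2 * cos_2sm ** 2) -
                   B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
                   (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma) / 1000.0   # kilometres

    # YAZ (49°4′47″N, 125°46′32″W) and BBX (40°8′15″N, 75°15′54″W)
    print(vincenty_distance_km(49.0797, -125.7756, 40.1375, -75.2650))
    # should come out near the 4050.9 km figure above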

Haversine formula
  • 2510.688 miles
  • 4040.561 kilometers
  • 2181.729 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
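
The haversine figure can be checked the same way. The sketch below uses a mean Earth radius of 6371 km; the exact radius is a modelling choice, so results may differ from the figures above by a kilometer or two.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        # Great-circle distance on a sphere of the given mean radius.
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        h = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(h))

    km = haversine_km(49.0797, -125.7756, 40.1375, -75.2650)
    print(round(km, 1), round(km / 1.609344, 1), round(km / 1.852, 1))
    # ≈ 4040 km / 2511 mi / 2182 nmi, close to the haversine figures above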

How long does it take to fly from Tofino to Philadelphia?

The estimated flight time from Tofino/Long Beach Airport to Wings Field is 5 hours and 15 minutes.
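
As a rough cross-check (the exact assumptions behind the estimate are not published): at an assumed average block speed of about 480 mph, 2517 mi ÷ 480 mph ≈ 5.24 h, which is about 5 hours 15 minutes.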

Flight carbon footprint between Tofino/Long Beach Airport (YAZ) and Wings Field (BBX)

On average, flying from Tofino to Philadelphia generates about 277 kg of CO2 per passenger, which is equal to about 611 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
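
For the unit conversion: 277 kg × 2.20462 lb/kg ≈ 611 lb.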

Map of flight path and driving directions from Tofino to Philadelphia

See the map of the shortest flight path between Tofino/Long Beach Airport (YAZ) and Wings Field (BBX).

Airport information

Origin: Tofino/Long Beach Airport
City: Tofino
Country: Canada
IATA Code: YAZ
ICAO Code: CYAZ
Coordinates: 49°4′47″N, 125°46′32″W

Destination: Wings Field
City: Philadelphia, PA
Country: United States
IATA Code: BBX
ICAO Code: KLOM
Coordinates: 40°8′15″N, 75°15′54″W