
How far is Wekweètì from Bangor, ME?

The distance between Bangor (Bangor International Airport) and Wekweètì (Wekweètì Airport) is 2187 miles / 3520 kilometers / 1901 nautical miles.

The driving distance from Bangor (BGR) to Wekweètì (YFJ) is 3570 miles / 5746 kilometers, and travel time by car is about 77 hours 7 minutes.

Distance from Bangor to Wekweètì

There are several ways to calculate the distance from Bangor to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 2187.409 miles
  • 3520.293 kilometers
  • 1900.806 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
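
For illustration, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The function name, iteration cap, and convergence tolerance are choices made for this example; the calculator's exact implementation is not published.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in statute miles via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0               # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate lambda until convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
                * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma) / 1609.344   # meters -> statute miles
```

Note that coincident points (sinSigma is zero) and nearly antipodal points (the iteration may fail to converge) need special handling; the Bangor–Wekweètì pair is well away from both cases.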

Haversine formula
  • 2182.074 miles
  • 3511.707 kilometers
  • 1896.170 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
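
A matching sketch of the haversine formula, assuming the conventional mean Earth radius of 6,371 km (the radius choice, like the function name, is an assumption; a different radius would shift the result by a fraction of a percent):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in statute miles on a sphere of the given radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = 2 * radius_km * math.asin(math.sqrt(h))
    return km / 1.609344   # kilometers -> statute miles
```

With the airport coordinates listed at the bottom of this page, this returns roughly the 2182-mile / 3512-kilometer figure above.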

How long does it take to fly from Bangor to Wekweètì?

The estimated flight time from Bangor International Airport to Wekweètì Airport is 4 hours and 38 minutes.
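
The page does not state how the flight time is derived. A common rule of thumb adds a fixed allowance for taxi, climb, and descent to cruise time at an assumed average speed; the sketch below uses illustrative parameters and will not exactly reproduce the 4 hours 38 minutes figure.

```python
def flight_time_estimate(miles, cruise_mph=500.0, overhead_min=30):
    # rule-of-thumb estimate: fixed overhead plus cruise time at an assumed speed;
    # both parameters are illustrative assumptions, not the site's actual model
    total_min = overhead_min + miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(flight_time_estimate(2187.409))   # about 4 h 52 min with these assumptions
```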

Flight carbon footprint between Bangor International Airport (BGR) and Wekweètì Airport (YFJ)

On average, flying from Bangor to Wekweètì generates about 239 kg (527 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
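
The stated figures imply a flat factor of about 239 kg ÷ 2187 mi ≈ 0.109 kg of CO2 per passenger-mile. A sketch using that back-calculated factor (an assumption for illustration; the calculator's actual methodology is not published):

```python
KG_PER_PASSENGER_MILE = 239 / 2187.409   # ~0.109, back-calculated from this page

def co2_estimate_kg(miles):
    # jet-fuel CO2 only, per the caveat above; the factor is an assumption
    return miles * KG_PER_PASSENGER_MILE

print(round(co2_estimate_kg(2187.409)))             # 239 (kg)
print(round(co2_estimate_kg(2187.409) * 2.20462))   # 527 (lb)
```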

Map of flight path and driving directions from Bangor to Wekweètì

See the map of the shortest flight path between Bangor International Airport (BGR) and Wekweètì Airport (YFJ).

Airport information

Origin: Bangor International Airport
City: Bangor, ME
Country: United States
IATA Code: BGR
ICAO Code: KBGR
Coordinates: 44°48′26″N, 68°49′41″W

Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
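
The coordinates above are given as degrees, minutes, and seconds, while the distance sketches earlier expect decimal degrees. A minimal conversion helper (the name dms_to_decimal is illustrative, not from any particular library):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees
    value = degrees + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Bangor International Airport (BGR): 44°48′26″N, 68°49′41″W
bgr_lat = dms_to_decimal(44, 48, 26, "N")
bgr_lon = dms_to_decimal(68, 49, 41, "W")

# Wekweètì Airport (YFJ): 64°11′26″N, 114°4′37″W
yfj_lat = dms_to_decimal(64, 11, 26, "N")
yfj_lon = dms_to_decimal(114, 4, 37, "W")

# Feeding these into the earlier sketches should roughly reproduce the page's figures:
# vincenty_miles(bgr_lat, bgr_lon, yfj_lat, yfj_lon)   -> about 2187 miles
# haversine_miles(bgr_lat, bgr_lon, yfj_lat, yfj_lon)  -> about 2182 miles
```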