
How far is Wekweètì from Flint, MI?

The distance between Flint (Bishop International Airport) and Wekweètì (Wekweètì Airport) is 1890 miles / 3042 kilometers / 1643 nautical miles.

The driving distance from Flint (FNT) to Wekweètì (YFJ) is 2888 miles / 4648 kilometers, and travel time by car is about 59 hours 42 minutes.


Distance from Flint to Wekweètì

There are several ways to calculate the distance from Flint to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 1890.269 miles
  • 3042.093 kilometers
  • 1642.599 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
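For reference, the ellipsoidal figure above can be reproduced with a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. This is a sketch for illustration; the decimal-degree coordinates are converted from the airport coordinates listed below.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Distance in statute miles via Vincenty's inverse formula (WGS-84).

    Assumes the two points are distinct and not nearly antipodal
    (the iteration can fail to converge in the antipodal case).
    """
    a = 6378137.0               # WGS-84 semi-major axis in meters
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):        # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    s = b * A * (sigma - delta_sigma)  # geodesic length in meters
    return s / 1609.344                # meters -> statute miles

# FNT (42°57′55″N, 83°44′36″W) and YFJ (64°11′26″N, 114°4′37″W)
miles = vincenty_miles(42.965278, -83.743333, 64.190556, -114.076944)
print(f"{miles:.3f} miles")
```

The result agrees with the 1890.269-mile figure quoted above to within rounding of the input coordinates.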

Haversine formula
  • 1886.844 miles
  • 3036.581 kilometers
  • 1639.623 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
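The spherical figure can be reproduced with a short haversine implementation. This is a sketch; the coordinates are converted from the airport coordinates listed below, and 6371 km is the conventional mean Earth radius.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers, assuming a spherical Earth."""
    R = 6371.0  # mean Earth radius in km (conventional value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# FNT (42°57′55″N, 83°44′36″W) and YFJ (64°11′26″N, 114°4′37″W)
km = haversine_km(42.965278, -83.743333, 64.190556, -114.076944)
print(f"{km:.1f} km / {km / 1.609344:.1f} mi")
```

The result lands within a kilometer or so of the 3036.581 km figure above; the small residual comes from the exact Earth radius used and coordinate rounding.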

How long does it take to fly from Flint to Wekweètì?

The estimated flight time from Bishop International Airport to Wekweètì Airport is 4 hours and 4 minutes.
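The quoted flight time is consistent with dividing the great-circle distance by an average block speed of roughly 465 mph. That speed is an assumption chosen for illustration; the site's exact flight-time model is not stated.

```python
distance_mi = 1890.269     # Vincenty distance quoted above
avg_speed_mph = 465        # assumed average block speed (illustrative)

total_hours = distance_mi / avg_speed_mph
hours = int(total_hours)
minutes = round((total_hours - hours) * 60)
print(f"{hours} hours {minutes} minutes")  # 4 hours 4 minutes
```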

Flight carbon footprint between Bishop International Airport (FNT) and Wekweètì Airport (YFJ)

On average, flying from Flint to Wekweètì generates about 207 kg (457 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.

Map of flight path and driving directions from Flint to Wekweètì

See the map of the shortest flight path between Bishop International Airport (FNT) and Wekweètì Airport (YFJ).

Airport information

Origin: Bishop International Airport
City: Flint, MI
Country: United States
IATA Code: FNT
ICAO Code: KFNT
Coordinates: 42°57′55″N, 83°44′36″W
Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W