
How far is Wekweètì from Muskegon, MI?

The distance between Muskegon (Muskegon County Airport) and Wekweètì (Wekweètì Airport) is 1818 miles / 2926 kilometers / 1580 nautical miles.

The driving distance from Muskegon (MKG) to Wekweètì (YFJ) is 2745 miles / 4418 kilometers, and travel time by car is about 57 hours 8 minutes.

Muskegon County Airport – Wekweètì Airport
  • 1818 miles
  • 2926 kilometers
  • 1580 nautical miles


Distance from Muskegon to Wekweètì

There are several ways to calculate the distance from Muskegon to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 1818.438 miles
  • 2926.492 kilometers
  • 1580.179 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
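As a rough cross-check, the sketch below computes the ellipsoidal distance between the two airports in Python with the geopy library, using the coordinates listed in the airport information section. Note that geopy's geodesic routine uses Karney's algorithm on the WGS-84 ellipsoid rather than Vincenty's formula itself, so the result may differ from the figure above by a small fraction of a mile.

    # Ellipsoidal distance between MKG and YFJ (geopy's geodesic uses Karney's
    # algorithm on the WGS-84 ellipsoid, closely matching Vincenty's formula).
    from geopy.distance import geodesic

    mkg = (43.169444, -86.238056)    # Muskegon County Airport, 43°10′10″N 86°14′17″W
    yfj = (64.190556, -114.076944)   # Wekweètì Airport, 64°11′26″N 114°4′37″W

    d = geodesic(mkg, yfj)
    print(f"{d.miles:.3f} miles / {d.km:.3f} km / {d.nautical:.3f} NM")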

Haversine formula
  • 1815.272 miles
  • 2921.397 kilometers
  • 1577.428 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
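For comparison, here is a minimal self-contained haversine sketch in Python using the same coordinates; the mean Earth radius of 6,371 km is an assumption, and choosing a slightly different radius shifts the result by a few kilometers.

    # Great-circle (haversine) distance between MKG and YFJ on a spherical Earth.
    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
        return 2 * radius_km * math.asin(math.sqrt(a))

    km = haversine_km(43.169444, -86.238056, 64.190556, -114.076944)
    print(f"{km * 0.621371:.1f} miles / {km:.1f} km / {km / 1.852:.1f} NM")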

How long does it take to fly from Muskegon to Wekweètì?

The estimated flight time from Muskegon County Airport to Wekweètì Airport is 3 hours and 56 minutes.
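The exact flight-time model is not published here; a common back-of-the-envelope approach adds a fixed allowance for taxi, climb, and descent to the cruise time at a typical jet speed. The 30-minute allowance and 500 mph cruise speed below are illustrative assumptions, so the result only roughly agrees with the estimate above.

    # Rough flight-time estimate: fixed taxi/climb/descent allowance plus cruise time.
    # The 30-minute allowance and 500 mph cruise speed are illustrative assumptions.
    distance_miles = 1818
    cruise_mph = 500
    overhead_min = 30

    total_min = overhead_min + distance_miles / cruise_mph * 60
    print(f"about {int(total_min // 60)} h {int(total_min % 60)} min")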

Flight carbon footprint between Muskegon County Airport (MKG) and Wekweètì Airport (YFJ)

On average, flying from Muskegon to Wekweètì generates about 202 kg of CO2 per passenger, which is roughly 444 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
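The per-passenger figure implies roughly 0.11 kg of CO2 per mile flown. The sketch below is only a back-of-the-envelope check using that inferred emission factor and the standard kilogram-to-pound conversion; the factor itself is not published on this page.

    # Back-of-the-envelope CO2 check: distance times an implied per-mile factor.
    # The 0.111 kg CO2 per passenger-mile factor is inferred from the figures above.
    distance_miles = 1818
    kg_co2_per_mile = 0.111

    kg_co2 = distance_miles * kg_co2_per_mile
    print(f"about {kg_co2:.0f} kg CO2, or {kg_co2 * 2.20462:.0f} lbs")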

Map of flight path and driving directions from Muskegon to Wekweètì

See the map of the shortest flight path between Muskegon County Airport (MKG) and Wekweètì Airport (YFJ).

Airport information

Origin: Muskegon County Airport
City: Muskegon, MI
Country: United States
IATA Code: MKG
ICAO Code: KMKG
Coordinates: 43°10′10″N, 86°14′17″W
Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W