
How far is Wekweètì from Fargo, ND?

The distance between Fargo (Hector International Airport) and Wekweètì (Wekweètì Airport) is 1363 miles / 2193 kilometers / 1184 nautical miles.

The driving distance from Fargo (FAR) to Wekweètì (YFJ) is 2085 miles / 3356 kilometers, and travel time by car is about 43 hours 59 minutes.

Hector International Airport – Wekweètì Airport

1363 miles
2193 kilometers
1184 nautical miles


Distance from Fargo to Wekweètì

There are several ways to calculate the distance from Fargo to Wekweètì. Here are two standard methods:

Vincenty's formula (applied above)
  • 1362.856 miles
  • 2193.305 kilometers
  • 1184.290 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
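
Vincenty's method is iterative. As an illustration only (not the exact code this calculator runs), the sketch below implements the standard inverse Vincenty solution on the WGS-84 ellipsoid in Python; the coordinates are the FAR and YFJ positions listed under Airport information, converted to decimal degrees, and the result should come out close to the 2193 km figure above.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Inverse Vincenty solution on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a            # semi-minor axis

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0                                # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM
                                    + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)               # geodesic length in metres

# FAR and YFJ coordinates in decimal degrees (from Airport information below)
far = (46.920556, -96.815556)    # 46°55′14″N, 96°48′56″W
yfj = (64.190556, -114.076944)   # 64°11′26″N, 114°4′37″W
print(vincenty_distance(*far, *yfj) / 1000)           # ≈ 2193 km
```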

Haversine formula
  • 1360.523 miles
  • 2189.550 kilometers
  • 1182.263 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (the great-circle distance – the shortest path between two points over the surface of a sphere).
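
For comparison, here is a minimal haversine sketch; the mean earth radius of 6371 km is an assumed constant (the page does not say which radius it uses), and with the FAR and YFJ coordinates it comes out close to the 2189.5 km figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance on a sphere of the given radius, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine_km(46.920556, -96.815556, 64.190556, -114.076944)
print(km, km * 0.621371, km * 0.539957)   # kilometres, statute miles, nautical miles
```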

How long does it take to fly from Fargo to Wekweètì?

The estimated flight time from Hector International Airport to Wekweètì Airport is 3 hours and 4 minutes.
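
The page does not state how this estimate is derived. A common back-of-envelope model, assumed here rather than taken from the site, is cruise time at roughly 500 mph plus a fixed allowance for taxi, climb and descent, which happens to be consistent with the 3 hours 4 minutes quoted:

```python
# Back-of-envelope flight-time estimate; the 500 mph cruise speed and the
# 20-minute ground/climb allowance are assumptions, not the site's stated inputs.
distance_miles = 1363
cruise_mph = 500
overhead_min = 20

total_min = distance_miles / cruise_mph * 60 + overhead_min
print(divmod(round(total_min), 60))   # (3, 4) -> about 3 hours 4 minutes
```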

Flight carbon footprint between Hector International Airport (FAR) and Wekweètì Airport (YFJ)

On average, flying from Fargo to Wekweètì generates about 171 kg of CO2 per passenger, which is equivalent to about 377 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
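
The kilogram-to-pound conversion and the implied per-mile rate are easy to check; note that the per-mile figure below is simply the ratio of the two numbers quoted here, not a published emission factor.

```python
co2_kg = 171
distance_miles = 1363

print(co2_kg * 2.20462)          # ≈ 377 lbs
print(co2_kg / distance_miles)   # ≈ 0.125 kg of CO2 per passenger-mile (implied)
```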

Map of flight path and driving directions from Fargo to Wekweètì

See the map of the shortest flight path between Hector International Airport (FAR) and Wekweètì Airport (YFJ).

Airport information

Origin: Hector International Airport
City: Fargo, ND
Country: United States
IATA Code: FAR
ICAO Code: KFAR
Coordinates: 46°55′14″N, 96°48′56″W
Destination: Wekweètì Airport
City: Wekweètì
Country: Canada
IATA Code: YFJ
ICAO Code: CYWE
Coordinates: 64°11′26″N, 114°4′37″W
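
The coordinates above are given in degrees, minutes and seconds; a small helper (purely illustrative, with a hypothetical name) converts them to the signed decimal degrees used in the distance sketches earlier on this page.

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus a hemisphere letter to signed decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

print(dms_to_decimal(46, 55, 14, "N"), dms_to_decimal(96, 48, 56, "W"))    # FAR
print(dms_to_decimal(64, 11, 26, "N"), dms_to_decimal(114, 4, 37, "W"))    # YFJ
```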