How far is Jackson, MS, from Hall Beach?

The distance between Hall Beach (Hall Beach Airport) and Jackson (Jackson–Medgar Wiley Evers International Airport) is 2545 miles / 4095 kilometers / 2211 nautical miles.

Hall Beach Airport – Jackson–Medgar Wiley Evers International Airport
  • 2545 miles
  • 4095 kilometers
  • 2211 nautical miles

Distance from Hall Beach to Jackson

There are several ways to calculate the distance from Hall Beach to Jackson. Here are two standard methods:

Vincenty's formula (applied above)
  • 2544.538 miles
  • 4095.037 kilometers
  • 2211.143 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
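
For reference, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The iteration tolerance, iteration cap, and function name are illustrative choices, not values published by this calculator; the coordinates come from the airport table at the end of this page.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty's inverse formula on the WGS-84 ellipsoid; returns statute miles."""
    a = 6378137.0             # WGS-84 semi-major axis (meters)
    f = 1 / 298.257223563     # WGS-84 flattening
    b = (1 - f) * a           # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        # cos(2 * sigma_m); zero when the path runs along the equator
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (2 * cos_2sm ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
        cos_sigma * (2 * cos_2sm ** 2 - 1)
        - B / 6 * cos_2sm * (4 * sin_sigma ** 2 - 3) * (4 * cos_2sm ** 2 - 3)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# YUX (68°46′33″N, 81°14′36″W) to JAN (32°18′40″N, 90°4′33″W)
print(vincenty_miles(68.775833, -81.243333, 32.311111, -90.075833))  # ~2544.5
```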

Haversine formula
  • 2543.552 miles
  • 4093.450 kilometers
  • 2210.286 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
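
A compact Python sketch of the haversine formula is below. The 6371 km mean Earth radius is a common convention (the site does not state which radius it uses), and the decimal coordinates are converted from the DMS values in the airport table.

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (a common convention)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

# YUX to JAN: roughly the 4093 km figure above
print(haversine_km(68.775833, -81.243333, 32.311111, -90.075833))
```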

How long does it take to fly from Hall Beach to Jackson?

The estimated flight time from Hall Beach Airport to Jackson–Medgar Wiley Evers International Airport is 5 hours and 19 minutes.
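
The page does not state its assumptions, but estimates like this are typically computed as distance divided by a typical jet cruise speed, plus a fixed allowance for takeoff and landing. A sketch under those assumptions; the 500 mph cruise speed and 30-minute allowance are illustrative constants, not the site's published values:

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough block-time estimate: cruise time plus a fixed takeoff/landing allowance."""
    minutes = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(minutes), 60)
    return f"{h} hours and {m} minutes"

# Gives about 5 h 35 min with these illustrative constants; the site's
# 5 h 19 min implies somewhat different assumptions.
print(estimate_flight_time(2545))
```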

Flight carbon footprint between Hall Beach Airport (YUX) and Jackson–Medgar Wiley Evers International Airport (JAN)

On average, flying from Hall Beach to Jackson generates about 280 kg of CO2 per passenger; 280 kilograms equals 618 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
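
Per-passenger estimates of this kind usually multiply the route distance by an emission factor (kg of CO2 per passenger-mile). The factor in the sketch below (~0.110 kg per passenger-mile) is simply back-calculated from the 280 kg figure on this page, not an authoritative constant:

```python
KG_PER_PASSENGER_MILE = 280 / 2545  # back-calculated from this page (~0.110)
KG_TO_LB = 2.20462                  # kilograms to pounds

def co2_estimate_kg(distance_miles):
    """Jet-fuel CO2 per passenger, using the factor implied by this route."""
    return distance_miles * KG_PER_PASSENGER_MILE

kg = co2_estimate_kg(2545)
# Prints "280 kg ≈ 617 lb"; the page's 618 lb reflects rounding of an
# unrounded underlying figure.
print(f"{kg:.0f} kg ≈ {kg * KG_TO_LB:.0f} lb")
```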

Map of flight path from Hall Beach to Jackson

See the map of the shortest flight path between Hall Beach Airport (YUX) and Jackson–Medgar Wiley Evers International Airport (JAN).

Airport information

Origin: Hall Beach Airport
City: Hall Beach
Country: Canada
IATA Code: YUX
ICAO Code: CYUX
Coordinates: 68°46′33″N, 81°14′36″W
Destination: Jackson–Medgar Wiley Evers International Airport
City: Jackson, MS
Country: United States
IATA Code: JAN
ICAO Code: KJAN
Coordinates: 32°18′40″N, 90°4′33″W
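
The coordinates above are given in degrees, minutes, and seconds, while the distance formulas need decimal degrees. A small converter (the function name and hemisphere convention are mine):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# Hall Beach Airport (YUX): 68°46′33″N, 81°14′36″W
print(dms_to_decimal(68, 46, 33, "N"), dms_to_decimal(81, 14, 36, "W"))
# Jackson–Medgar Wiley Evers International Airport (JAN): 32°18′40″N, 90°4′33″W
print(dms_to_decimal(32, 18, 40, "N"), dms_to_decimal(90, 4, 33, "W"))
```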