
How far is Gillam from Newburgh, NY?

The distance between Newburgh (Stewart International Airport) and Gillam (Gillam Airport) is 1380 miles / 2221 kilometers / 1199 nautical miles.

The driving distance from Newburgh (SWF) to Gillam (YGX) is 2320 miles / 3734 kilometers, and travel time by car is about 46 hours 46 minutes.

Stewart International Airport – Gillam Airport
  • 1380 miles
  • 2221 kilometers
  • 1199 nautical miles


Distance from Newburgh to Gillam

There are several ways to calculate the distance from Newburgh to Gillam. Here are two standard methods:

Vincenty's formula (applied above)
  • 1379.760 miles
  • 2220.509 kilometers
  • 1198.979 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
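For the curious, Vincenty's inverse solution can be sketched in plain Python. This sketch assumes the standard WGS-84 ellipsoid constants (semi-major axis 6 378 137 m, flattening 1/298.257223563); whether the calculator uses exactly these parameters is an assumption.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, iterations=200, tol=1e-12):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty, 1975)."""
    a = 6378137.0                 # WGS-84 semi-major axis, metres
    f = 1 / 298.257223563         # WGS-84 flattening
    b = (1 - f) * a               # semi-minor axis

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L  # iterate on the difference in longitude on the auxiliary sphere
    for _ in range(iterations):
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha) if cos2Alpha else 0.0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < tol:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # metres

# SWF: 41°30′14″N, 74°6′17″W   YGX: 56°21′26″N, 94°42′38″W
d = vincenty_inverse(41.503889, -74.104722, 56.357222, -94.710556)
print(f"{d / 1000:.3f} km")  # ≈ 2220.5 km
```

With the coordinates from the airport-information section below, this reproduces the ≈2220.5 km figure quoted above.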

Haversine formula
  • 1377.829 miles
  • 2217.402 kilometers
  • 1197.301 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
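The haversine figures above can be reproduced with a short script. This is a sketch assuming a mean Earth radius of 6371 km (the value the calculator appears to use) and the airport coordinates listed in the airport-information section below.

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance assuming a spherical Earth.
    Returns (statute miles, kilometres, nautical miles)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    km = radius_km * 2 * math.asin(math.sqrt(a))
    return km / 1.609344, km, km / 1.852

# SWF: 41°30′14″N, 74°6′17″W   YGX: 56°21′26″N, 94°42′38″W
mi, km, nm = haversine(41.503889, -74.104722, 56.357222, -94.710556)
print(f"{mi:.3f} mi / {km:.3f} km / {nm:.3f} NM")  # ≈ 1377.8 mi / 2217.4 km / 1197.3 NM
```

The result matches the haversine values quoted above to within a fraction of a kilometre.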

How long does it take to fly from Newburgh to Gillam?

The estimated flight time from Stewart International Airport to Gillam Airport is 3 hours and 6 minutes.
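The page does not state the model behind this estimate. A common rule of thumb is cruise time at an assumed average ground speed plus a fixed allowance for taxi, climb, and descent; both parameters below are assumptions, and tuning them changes the result.

```python
def estimate_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: distance / assumed cruise speed,
    plus a fixed overhead for taxi, climb, and descent.
    Both defaults are illustrative assumptions, not the site's model."""
    total_min = distance_miles / cruise_mph * 60 + overhead_min
    h, m = divmod(round(total_min), 60)
    return f"{h} h {m} min"

print(estimate_flight_time(1379.76))  # ≈ 3 h 16 min with these assumptions
```

With these particular assumptions the sketch lands about ten minutes above the 3 h 6 min quoted above; a slightly faster assumed cruise speed closes the gap.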

Flight carbon footprint between Stewart International Airport (SWF) and Gillam Airport (YGX)

On average, flying from Newburgh to Gillam generates about 172 kg (379 lb) of CO2 per passenger. These figures are estimates and account only for the CO2 produced by burning jet fuel.
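The kilogram-to-pound conversion, and a per-mile figure derived from it, can be checked directly; the 172 kg value is the site's own per-passenger estimate for this route.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    return kg / KG_PER_LB

per_flight_kg = 172        # site's per-passenger estimate for SWF–YGX
route_miles = 1379.76      # Vincenty distance from above

print(f"{per_flight_kg} kg ≈ {kg_to_lb(per_flight_kg):.0f} lb")          # ≈ 379 lb
print(f"{per_flight_kg / route_miles:.3f} kg CO2 per passenger-mile")     # ≈ 0.125
```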

Map of flight path and driving directions from Newburgh to Gillam

See the map of the shortest flight path between Stewart International Airport (SWF) and Gillam Airport (YGX).

Airport information

Origin: Stewart International Airport
City: Newburgh, NY
Country: United States
IATA Code: SWF
ICAO Code: KSWF
Coordinates: 41°30′14″N, 74°6′17″W
Destination: Gillam Airport
City: Gillam
Country: Canada
IATA Code: YGX
ICAO Code: CYGX
Coordinates: 56°21′26″N, 94°42′38″W