How far is London from Newburgh, NY?
The distance between Newburgh (Stewart International Airport) and London (London International Airport) is 376 miles / 606 kilometers / 327 nautical miles.
The driving distance from Newburgh (SWF) to London (YXU) is 469 miles / 754 kilometers, and travel time by car is about 9 hours 54 minutes.
Stewart International Airport – London International Airport
Distance from Newburgh to London
There are several ways to calculate the distance from Newburgh to London. Here are two standard methods:
Vincenty's formula (applied above)
- 376.366 miles
- 605.703 kilometers
- 327.054 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
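Vincenty's inverse method is iterative. Below is a minimal, generic Python sketch of it on the WGS-84 ellipsoid; it is a textbook-style implementation, not necessarily the exact code behind the figures on this page.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two lat/lon points (decimal degrees)
    using Vincenty's inverse formula on the WGS-84 ellipsoid."""
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha) if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma *
                                     (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# Approximate decimal-degree coordinates for SWF and YXU (from the tables below).
print(vincenty_distance_m(41.5039, -74.1047, 43.0356, -81.1539) / 1609.344)  # roughly 376 miles
```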
Haversine formula
- 375.486 miles
- 604.285 kilometers
- 326.288 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
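For comparison, here is a minimal Python sketch of the haversine calculation. The decimal-degree coordinates are approximations derived from the DMS values in the airport tables below, so the result only roughly matches the figure quoted above.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2, radius_miles=3958.8):
    """Great-circle distance between two lat/lon points (decimal degrees),
    assuming a spherical earth with the given mean radius."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_miles * math.asin(math.sqrt(a))

# Approximate decimal-degree coordinates for SWF and YXU.
swf = (41.5039, -74.1047)
yxu = (43.0356, -81.1539)
print(f"{haversine_miles(*swf, *yxu):.1f} miles")  # roughly 375 miles
```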
How long does it take to fly from Newburgh to London?
The estimated flight time from Stewart International Airport to London International Airport is 1 hour and 12 minutes.
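The page does not state how this estimate is derived. A common rule of thumb, sketched below, adds a fixed taxi/climb/descent allowance to cruise time at an assumed average speed; both numbers are assumptions, so the result only lands near the 1 hour 12 minute figure rather than matching it exactly.

```python
def estimated_flight_time_minutes(distance_miles, cruise_mph=500, buffer_minutes=30):
    """Rough estimate: fixed taxi/takeoff/landing buffer plus cruise time
    at an assumed average speed (both values are assumptions)."""
    return buffer_minutes + distance_miles / cruise_mph * 60

minutes = estimated_flight_time_minutes(376.366)
print(f"about {int(minutes // 60)} h {round(minutes % 60)} min")  # about 1 h 15 min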
What is the time difference between Newburgh and London?
There is no time difference between Newburgh and London, Ontario: both cities observe Eastern Time.
Flight carbon footprint between Stewart International Airport (SWF) and London International Airport (YXU)
On average, flying from Newburgh to London generates about 80 kg of CO2 per passenger; 80 kilograms equals roughly 176 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
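As a rough sketch of how a per-passenger figure like this can be produced, the calculation below multiplies the flight distance by an assumed emissions factor. The factor of about 0.132 kg CO2 per passenger-kilometre is an assumption chosen to reproduce the 80 kg figure, not a value stated by the source.

```python
KG_PER_POUND = 0.45359237

def flight_co2_kg(distance_km, kg_co2_per_pax_km=0.132):
    """Per-passenger CO2 estimate: distance times an assumed emissions
    factor (covering jet-fuel burn only, as noted above)."""
    return distance_km * kg_co2_per_pax_km

co2_kg = flight_co2_kg(605.703)
print(f"{co2_kg:.0f} kg CO2 (~{co2_kg / KG_PER_POUND:.0f} lbs)")  # ~80 kg (~176 lbs)
```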
Map of flight path and driving directions from Newburgh to London
See the map of the shortest flight path between Stewart International Airport (SWF) and London International Airport (YXU).
Airport information
| Origin | Stewart International Airport |
| --- | --- |
| City: | Newburgh, NY |
| Country: | United States |
| IATA Code: | SWF |
| ICAO Code: | KSWF |
| Coordinates: | 41°30′14″N, 74°6′17″W |
| Destination | London International Airport |
| --- | --- |
| City: | London |
| Country: | Canada |
| IATA Code: | YXU |
| ICAO Code: | CYXU |
| Coordinates: | 43°2′8″N, 81°9′14″W |
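The coordinates above are given in degrees, minutes, and seconds. A small sketch of converting them to the decimal degrees used by the distance formulas earlier on this page (the parsing format is an assumption):

```python
import re

def dms_to_decimal(dms: str) -> float:
    """Convert a coordinate like '41°30′14″N' to signed decimal degrees."""
    deg, minute, second, hemi = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minute) / 60 + int(second) / 3600
    return -value if hemi in "SW" else value

# SWF and YXU coordinates from the tables above.
print(dms_to_decimal("41°30′14″N"), dms_to_decimal("74°6′17″W"))  # ≈ 41.5039, -74.1047
print(dms_to_decimal("43°2′8″N"), dms_to_decimal("81°9′14″W"))    # ≈ 43.0356, -81.1539
```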