How far is Springfield, IL, from Arctic Bay?
The distance between Arctic Bay (Arctic Bay Airport) and Springfield (Springfield Abraham Lincoln Capital Airport) is 2300 miles / 3701 kilometers / 1998 nautical miles.
Arctic Bay Airport – Springfield Abraham Lincoln Capital Airport
Distance from Arctic Bay to Springfield
There are several ways to calculate the distance from Arctic Bay to Springfield. Here are two standard methods:
Vincenty's formula (applied above)
- 2299.528 miles
- 3700.731 kilometers
- 1998.235 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
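Vincenty's inverse method can be sketched as follows. This is a minimal implementation of the standard iterative formula on the WGS-84 ellipsoid; the coordinates are the YAB and SPI values from the airport tables below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in meters on the WGS-84 ellipsoid (Vincenty's inverse formula)."""
    a = 6378137.0            # semi-major axis (m)
    f = 1 / 298.257223563    # flattening
    b = (1 - f) * a          # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (cos_sigma * (-1 + 2 * cos_2sm ** 2)
                           - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                           * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)  # meters

# YAB (73°0′20″N, 85°2′33″W) and SPI (39°50′38″N, 89°40′40″W) in decimal degrees
meters = vincenty_inverse(73.005556, -85.0425, 39.843889, -89.677778)
print(round(meters / 1000, 3), "km")
```

Run against the coordinates above, this reproduces a distance of roughly 3700.7 km, matching the Vincenty figure quoted earlier.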
Haversine formula
- 2296.560 miles
- 3695.954 kilometers
- 1995.656 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
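The haversine calculation is much simpler. A minimal sketch, using a mean Earth radius of 6371 km (the conventional spherical approximation) and the same YAB and SPI coordinates:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in km between two lat/lon points on a sphere."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

# YAB to SPI, decimal degrees
print(round(haversine(73.005556, -85.0425, 39.843889, -89.677778), 3), "km")
```

This yields roughly 3696 km, a few kilometers short of the ellipsoidal Vincenty result, which is typical of the spherical approximation.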
How long does it take to fly from Arctic Bay to Springfield?
The estimated flight time from Arctic Bay Airport to Springfield Abraham Lincoln Capital Airport is 4 hours and 51 minutes.
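The source does not state the speed behind this estimate, but 2299.528 miles in 4 h 51 min implies an average block speed of roughly 474 mph. A sketch of the calculation under that assumption (the speed value is an inference, not a figure from the article):

```python
def flight_time(distance_miles, avg_speed_mph):
    """Rough flight-time estimate at a constant assumed average speed."""
    hours = distance_miles / avg_speed_mph
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:          # guard against rounding 59.5+ minutes up past the hour
        h, m = h + 1, 0
    return f"{h} h {m} min"

# ~474 mph average reproduces the quoted estimate for the YAB-SPI distance
print(flight_time(2299.528, 474))  # 4 h 51 min
```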
What is the time difference between Arctic Bay and Springfield?
There is no time difference between Arctic Bay and Springfield.
Flight carbon footprint between Arctic Bay Airport (YAB) and Springfield Abraham Lincoln Capital Airport (SPI)
On average, flying from Arctic Bay to Springfield generates about 252 kg (555 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 produced by burning jet fuel.
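The kilogram-to-pound conversion behind the quoted figure, using the exact definition of the avoirdupois pound (the article rounds the result down to 555 lb):

```python
KG_PER_LB = 0.45359237  # exact, by definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert a mass in kilograms to pounds."""
    return kg / KG_PER_LB

print(round(kg_to_lb(252), 1), "lb")  # 555.6 lb
```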
Map of flight path from Arctic Bay to Springfield
See the map of the shortest flight path between Arctic Bay Airport (YAB) and Springfield Abraham Lincoln Capital Airport (SPI).
Airport information
| Origin | Arctic Bay Airport |
| --- | --- |
| City: | Arctic Bay |
| Country: | Canada |
| IATA Code: | YAB |
| ICAO Code: | CYAB |
| Coordinates: | 73°0′20″N, 85°2′33″W |
| Destination | Springfield Abraham Lincoln Capital Airport |
| --- | --- |
| City: | Springfield, IL |
| Country: | United States |
| IATA Code: | SPI |
| ICAO Code: | KSPI |
| Coordinates: | 39°50′38″N, 89°40′40″W |