How far is Springfield, IL, from Cambridge Bay?
The distance between Cambridge Bay (Cambridge Bay Airport) and Springfield (Springfield Abraham Lincoln Capital Airport) is 2103 miles / 3385 kilometers / 1828 nautical miles.
Cambridge Bay Airport – Springfield Abraham Lincoln Capital Airport
Distance from Cambridge Bay to Springfield
There are several ways to calculate the distance from Cambridge Bay to Springfield. Here are two standard methods:
Vincenty's formula (applied above)
- 2103.229 miles
- 3384.819 kilometers
- 1827.656 nautical miles
Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
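Vincenty's method itself is iterative and fiddly to implement by hand. As a minimal sketch, the geopy library's geodesic() function computes the same kind of ellipsoidal (WGS-84) distance; it uses Karney's algorithm rather than Vincenty's, but the two agree to well under a meter on a route like this. The decimal coordinates are converted from the DMS values in the airport tables below.

```python
# Sketch: ellipsoidal distance with geopy (Karney's algorithm on WGS-84,
# near-identical to Vincenty for this route).
from geopy.distance import geodesic

# Decimal-degree coordinates derived from the DMS values listed below.
ycb = (69.1081, -105.1378)   # Cambridge Bay Airport (YCB)
spi = (39.8439, -89.6778)    # Springfield Abraham Lincoln Capital Airport (SPI)

d = geodesic(ycb, spi)
print(f"{d.miles:.3f} mi / {d.kilometers:.3f} km / {d.nautical:.3f} nmi")
# Should land within a small fraction of a mile of the figures above.
```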
Haversine formula
- 2100.785 miles
- 3380.886 kilometers
- 1825.532 nautical miles
The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
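The haversine formula is simple enough to compute directly. A dependency-free sketch, assuming the commonly quoted mean Earth radius of 6,371 km, which reproduces the ~3,381 km figure above:

```python
# Sketch: great-circle (haversine) distance on a spherical Earth.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere of radius r_km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r_km * asin(sqrt(a))

km = haversine_km(69.1081, -105.1378, 39.8439, -89.6778)
print(f"{km / 1.609344:.3f} mi / {km:.3f} km / {km / 1.852:.3f} nmi")
# -> approximately 2100.8 mi / 3380.9 km / 1825.5 nmi
```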
How long does it take to fly from Cambridge Bay to Springfield?
The estimated flight time from Cambridge Bay Airport to Springfield Abraham Lincoln Capital Airport is 4 hours and 28 minutes.
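The page does not state how this estimate is derived. A common approximation simply divides the great-circle distance by an assumed average speed; an assumed average of about 470 mph happens to reproduce the 4 hours 28 minutes figure, but that speed is a reverse-engineered assumption, not a stated input.

```python
# Sketch: flight-time estimate from distance and an assumed average speed.
distance_mi = 2103           # Vincenty distance from above
avg_speed_mph = 470          # assumed average speed (reverse-engineered, not stated by the page)
hours = distance_mi / avg_speed_mph
h, m = int(hours), round((hours - int(hours)) * 60)
print(f"Estimated flight time: {h} h {m} min")  # -> 4 h 28 min
```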
What is the time difference between Cambridge Bay and Springfield?
The time difference between Cambridge Bay and Springfield is 1 hour. Springfield is 1 hour ahead of Cambridge Bay: Cambridge Bay is on Mountain Time, while Springfield is on Central Time, and both observe daylight saving time.
Flight carbon footprint between Cambridge Bay Airport (YCB) and Springfield Abraham Lincoln Capital Airport (SPI)
On average, flying from Cambridge Bay to Springfield generates about 229 kg of CO2 per passenger, which is roughly 505 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
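As a sketch of the arithmetic, the figures above imply an emission factor of roughly 0.109 kg of CO2 per passenger-mile. That factor is reverse-engineered from the page's own numbers, not a published constant:

```python
# Sketch: per-passenger CO2 estimate from distance and an implied factor.
distance_mi = 2103
co2_kg = distance_mi * 0.109        # assumed kg CO2 per passenger-mile (implied by the page)
co2_lb = co2_kg * 2.20462           # kilograms to pounds
print(f"{co2_kg:.0f} kg CO2 per passenger (~{co2_lb:.0f} lb)")
# -> roughly 229 kg (~505 lb)
```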
Map of flight path from Cambridge Bay to Springfield
See the map of the shortest flight path between Cambridge Bay Airport (YCB) and Springfield Abraham Lincoln Capital Airport (SPI).
Airport information
| Origin | Cambridge Bay Airport |
| --- | --- |
| City: | Cambridge Bay |
| Country: | Canada |
| IATA Code: | YCB |
| ICAO Code: | CYCB |
| Coordinates: | 69°6′29″N, 105°8′16″W |
| Destination | Springfield Abraham Lincoln Capital Airport |
| --- | --- |
| City: | Springfield, IL |
| Country: | United States |
| IATA Code: | SPI |
| ICAO Code: | KSPI |
| Coordinates: | 39°50′38″N, 89°40′40″W |
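The coordinates in these tables are given in degrees, minutes, and seconds. A small sketch of the conversion to the decimal degrees used by the distance calculations earlier:

```python
# Sketch: convert degrees/minutes/seconds plus hemisphere to decimal degrees.
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """South and West hemispheres yield negative decimal degrees."""
    value = deg + minutes / 60 + seconds / 3600
    return -value if hemisphere in ("S", "W") else value

# Cambridge Bay Airport: 69°6'29"N, 105°8'16"W
print(dms_to_decimal(69, 6, 29, "N"), dms_to_decimal(105, 8, 16, "W"))   # 69.1081 -105.1378
# Springfield Abraham Lincoln Capital Airport: 39°50'38"N, 89°40'40"W
print(dms_to_decimal(39, 50, 38, "N"), dms_to_decimal(89, 40, 40, "W"))  # 39.8439 -89.6778
```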