
How far is Bagotville from Abilene, TX?

The distance between Abilene (Abilene Regional Airport) and Bagotville (CFB Bagotville) is 1852 miles / 2981 kilometers / 1610 nautical miles.

The driving distance from Abilene (ABI) to Bagotville (YBG) is 2162 miles / 3479 kilometers, and travel time by car is about 41 hours 32 minutes.

Abilene Regional Airport – CFB Bagotville

  • 1852 miles
  • 2981 kilometers
  • 1610 nautical miles


Distance from Abilene to Bagotville

There are several ways to calculate the distance from Abilene to Bagotville. Here are two standard methods:

Vincenty's formula (applied above)
  • 1852.257 miles
  • 2980.919 kilometers
  • 1609.567 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
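Vincenty's inverse method solves for the geodesic iteratively on the ellipsoid. Below is a minimal Python sketch using the WGS-84 parameters; the iteration cap and convergence tolerance are assumptions, and the coordinates are the ABI and YBG coordinates from the airport information section, converted to decimal degrees.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance between two points via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0             # semi-major axis (m)
    f = 1 / 298.257223563     # flattening
    b = (1 - f) * a           # semi-minor axis (m)

    # Reduced latitudes and longitude difference
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        # cos(2*sigma_m); zero on the equatorial geodesic
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma) / 1609.344  # meters -> statute miles

# ABI (32°24'40"N, 99°40'54"W) to YBG (48°19'50"N, 70°59'47"W)
d = vincenty_miles(32.41111, -99.68167, 48.33056, -70.99639)  # ~1852 miles
```

Vincenty's iteration can fail to converge for nearly antipodal points, which is why production geodesic libraries add special handling; it is not an issue for this route.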

Haversine formula
  • 1850.143 miles
  • 2977.516 kilometers
  • 1607.730 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
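The haversine formula is much simpler, since a sphere has a closed-form great-circle distance. A short Python sketch, assuming a mean Earth radius of 6371 km (about 3958.8 statute miles):

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance on a spherical Earth (haversine formula)."""
    R = 3958.8  # mean Earth radius in statute miles (assumed value)
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

# ABI to YBG, decimal degrees
d = haversine_miles(32.41111, -99.68167, 48.33056, -70.99639)  # ~1850 miles
```

The ~2-mile gap between this result and the Vincenty figure above reflects the spherical versus ellipsoidal Earth models, not an error in either formula.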

How long does it take to fly from Abilene to Bagotville?

The estimated flight time from Abilene Regional Airport to CFB Bagotville is about 4 hours.
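The calculator's exact model is not published; a common rule of thumb is the cruise leg at a typical jet speed plus a fixed allowance for taxi, climb, and descent. The 500 mph cruise speed and 30-minute allowance below are assumptions, so the result only roughly matches the figure above.

```python
def flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: cruise leg plus a fixed taxi/climb/descent
    allowance. Both parameters are assumed values, not the site's model."""
    total_min = round(distance_miles / cruise_mph * 60 + overhead_min)
    return divmod(total_min, 60)  # (hours, minutes)

hours, minutes = flight_time(1852)  # (4, 12) under these assumptions
```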

Flight carbon footprint between Abilene Regional Airport (ABI) and CFB Bagotville (YBG)

On average, flying from Abilene to Bagotville generates about 204 kg (450 lb) of CO2 per passenger. This figure is an estimate and includes only the CO2 generated by burning jet fuel.
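The kilograms-to-pounds conversion and the implied per-mile emission rate can be checked directly; the per-mile rate below is derived from the figures on this page, not an official emission factor.

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg / KG_PER_LB

co2_kg = 204
co2_lb = round(kg_to_lb(co2_kg))   # 450 lb, matching the figure above
per_mile = co2_kg / 1852           # ~0.11 kg CO2 per passenger-mile (derived)
```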

Map of flight path and driving directions from Abilene to Bagotville

See the map of the shortest flight path between Abilene Regional Airport (ABI) and CFB Bagotville (YBG).

Airport information

Origin: Abilene Regional Airport
City: Abilene, TX
Country: United States
IATA Code: ABI
ICAO Code: KABI
Coordinates: 32°24′40″N, 99°40′54″W

Destination: CFB Bagotville
City: Bagotville
Country: Canada
IATA Code: YBG
ICAO Code: CYBG
Coordinates: 48°19′50″N, 70°59′47″W