
How far is Brochet from Alpena, MI?

The distance between Alpena (Alpena County Regional Airport) and Brochet (Brochet Airport) is 1174 miles / 1889 kilometers / 1020 nautical miles.

The driving distance from Alpena (APN) to Brochet (YBT) is 1644 miles / 2646 kilometers, and travel time by car is about 36 hours 25 minutes.

Alpena County Regional Airport – Brochet Airport

1174 miles / 1889 kilometers / 1020 nautical miles


Distance from Alpena to Brochet

There are several ways to calculate the distance from Alpena to Brochet. Here are two standard methods:

Vincenty's formula (applied above)
  • 1173.990 miles
  • 1889.354 kilometers
  • 1020.169 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
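As a rough illustration, an ellipsoidal distance like the one above can be reproduced in Python with the third-party geopy library. Note that geopy's geodesic() uses Karney's algorithm rather than Vincenty's, but both work on the WGS-84 ellipsoid and agree closely; the decimal coordinates below are converted from the airport coordinates listed at the end of this page.

```python
# Ellipsoidal (WGS-84) distance between APN and YBT using geopy.
# geopy's geodesic() implements Karney's algorithm, not Vincenty's,
# but on the same ellipsoidal earth model the results agree closely.
from geopy.distance import geodesic

apn = (45.0781, -83.5603)    # Alpena County Regional Airport (45°4′41″N, 83°33′37″W)
ybt = (57.8892, -101.6789)   # Brochet Airport (57°53′21″N, 101°40′44″W)

d = geodesic(apn, ybt)
print(f"{d.miles:.1f} miles / {d.km:.1f} km / {d.nautical:.1f} nm")
# Prints roughly 1174 miles / 1889 km / 1020 nm
```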

Haversine formula
  • 1172.030 miles
  • 1886.200 kilometers
  • 1018.467 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
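For comparison, here is a minimal haversine implementation in Python. It assumes a mean earth radius of 6,371 km, which is why its result differs slightly from the ellipsoidal figure above.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * radius_km * math.asin(math.sqrt(a))

# APN (Alpena) to YBT (Brochet), coordinates in decimal degrees
km = haversine_km(45.0781, -83.5603, 57.8892, -101.6789)
print(f"{km:.1f} km / {km * 0.621371:.1f} miles / {km / 1.852:.1f} nm")
# Prints roughly 1886 km / 1172 miles / 1018 nm
```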

How long does it take to fly from Alpena to Brochet?

The estimated flight time from Alpena County Regional Airport to Brochet Airport is 2 hours and 43 minutes.
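Estimates like this are commonly built from a fixed allowance for taxi, climb, and descent plus cruise time at a typical airliner speed. The sketch below uses assumed constants (30 minutes of overhead and a 500 mph average cruise), so it only lands in the same ballpark as the figure quoted above.

```python
# Rough flight-time estimate: fixed overhead for taxi/climb/descent plus
# cruise time at an assumed average speed. Both constants are assumptions,
# so the result only approximates the 2 h 43 min quoted above.
OVERHEAD_MIN = 30        # assumed taxi, climb, and descent allowance
CRUISE_MPH = 500         # assumed average cruise speed

distance_miles = 1174
total_min = OVERHEAD_MIN + distance_miles / CRUISE_MPH * 60
print(f"about {int(total_min // 60)} h {int(total_min % 60)} min")  # about 2 h 50 min
```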

Flight carbon footprint between Alpena County Regional Airport (APN) and Brochet Airport (YBT)

On average, flying from Alpena to Brochet generates about 160 kg (353 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
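As a back-of-the-envelope check, the kilogram-to-pound conversion uses the standard factor of about 2.20462, and the per-passenger figure works out to roughly 0.14 kg of CO2 per mile flown on this route; treating the total as a flat per-mile rate is a simplification.

```python
co2_kg = 160            # estimated CO2 per passenger for this flight
distance_miles = 1174

co2_lb = co2_kg * 2.20462          # kilograms to pounds
kg_per_mile = co2_kg / distance_miles

print(f"{co2_lb:.0f} lb total, about {kg_per_mile:.3f} kg CO2 per mile")
# 353 lb total, about 0.136 kg CO2 per mile
```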

Map of flight path and driving directions from Alpena to Brochet

See the map of the shortest flight path between Alpena County Regional Airport (APN) and Brochet Airport (YBT).

Airport information

Origin: Alpena County Regional Airport
City: Alpena, MI
Country: United States
IATA Code: APN
ICAO Code: KAPN
Coordinates: 45°4′41″N, 83°33′37″W

Destination: Brochet Airport
City: Brochet
Country: Canada
IATA Code: YBT
ICAO Code: CYBT
Coordinates: 57°53′21″N, 101°40′44″W