
How far is Brochet from Nantucket, MA?

The distance between Nantucket (Nantucket Memorial Airport) and Brochet (Brochet Airport) is 1800 miles / 2896 kilometers / 1564 nautical miles.

The driving distance from Nantucket (ACK) to Brochet (YBT) is 2573 miles / 4141 kilometers, and travel time by car is about 55 hours 1 minute.

Nantucket Memorial Airport – Brochet Airport

1800 miles / 2896 kilometers / 1564 nautical miles


Distance from Nantucket to Brochet

There are several ways to calculate the distance from Nantucket to Brochet. Here are two standard methods:

Vincenty's formula (applied above)
  • 1799.586 miles
  • 2896.153 kilometers
  • 1563.797 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.

Haversine formula
  • 1796.204 miles
  • 2890.710 kilometers
  • 1560.858 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
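The haversine calculation is short enough to show in full. This sketch uses a mean Earth radius of 6371 km (a common convention, assumed here rather than taken from the page) and the airport coordinates listed below:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance in km on a sphere of mean radius r."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * r * asin(sqrt(a))

# ACK: 41°15′11″N, 70°3′36″W   YBT: 57°53′21″N, 101°40′44″W
print(round(haversine_km(41.2531, -70.0600, 57.8892, -101.6789), 1))  # ≈ 2891 km
```

This reproduces the 2890.710 km figure above; the small gap versus the Vincenty result reflects the spherical versus ellipsoidal Earth models.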

How long does it take to fly from Nantucket to Brochet?

The estimated flight time from Nantucket Memorial Airport to Brochet Airport is 3 hours and 54 minutes.
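Flight-time estimates like this are typically a fixed taxi/climb allowance plus cruise time over the great-circle distance. The 30-minute allowance and 500 mph cruise speed below are illustrative assumptions, not the calculator's actual parameters, so the result is a ballpark rather than an exact match for the 3 h 54 min figure:

```python
def flight_time_minutes(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough estimate: fixed overhead plus time at cruise speed (assumed values)."""
    return overhead_min + distance_miles / cruise_mph * 60

t = flight_time_minutes(1800)
print(f"{int(t // 60)} h {int(t % 60)} min")  # about 4 h 6 min with these assumptions
```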

Flight carbon footprint between Nantucket Memorial Airport (ACK) and Brochet Airport (YBT)

On average, flying from Nantucket to Brochet generates about 200 kg of CO2 per passenger; 200 kilograms is equal to 441 pounds (lbs). The figures are estimates and include only the CO2 generated by burning jet fuel.
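The kilogram-to-pound conversion can be checked directly (one pound is defined as exactly 0.45359237 kg):

```python
KG_PER_LB = 0.45359237  # exact, by international definition of the pound

co2_kg = 200
co2_lb = co2_kg / KG_PER_LB
print(round(co2_lb))  # 441
```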

Map of flight path and driving directions from Nantucket to Brochet

See the map of the shortest flight path between Nantucket Memorial Airport (ACK) and Brochet Airport (YBT).

Airport information

Origin: Nantucket Memorial Airport
City: Nantucket, MA
Country: United States
IATA Code: ACK
ICAO Code: KACK
Coordinates: 41°15′11″N, 70°3′36″W
Destination: Brochet Airport
City: Brochet
Country: Canada
IATA Code: YBT
ICAO Code: CYBT
Coordinates: 57°53′21″N, 101°40′44″W