How far is Lubbock, TX, from Whale Cove?

The distance between Whale Cove (Whale Cove Airport) and Lubbock (Lubbock Preston Smith International Airport) is 2016 miles / 3244 kilometers / 1752 nautical miles.

The driving distance from Whale Cove (YXN) to Lubbock (LBB) is 2036 miles / 3277 kilometers, and travel time by car is about 41 hours 31 minutes.

Whale Cove Airport – Lubbock Preston Smith International Airport

2016 miles / 3244 kilometers / 1752 nautical miles

Distance from Whale Cove to Lubbock

There are several ways to calculate the distance from Whale Cove to Lubbock. Here are two standard methods:

Vincenty's formula (applied above)
  • 2015.643 miles
  • 3243.864 kilometers
  • 1751.546 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
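
For readers who want to reproduce the figure, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. It is the standard textbook iteration, not necessarily the exact code behind this page; the coordinates are the decimal-degree equivalents of the DMS values listed under Airport information below.

    import math

    def vincenty_distance_m(lat1, lon1, lat2, lon2):
        """Ellipsoidal distance in metres via Vincenty's inverse formula (WGS-84)."""
        a = 6378137.0                  # semi-major axis in metres
        f = 1 / 298.257223563          # flattening
        b = (1 - f) * a                # semi-minor axis

        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        L = math.radians(lon2 - lon1)
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(200):           # iterate lambda until it converges
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0             # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            cos_2sm = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha if cos2_alpha else 0.0
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (sigma + C * sin_sigma *
                  (cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < 1e-12:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2) -
            B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # YXN and LBB in decimal degrees (west longitudes negative)
    km = vincenty_distance_m(62.2400, -92.5981, 33.6633, -101.8228) / 1000
    print(f"{km:.3f} km")   # should land very close to the 3243.864 km above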

Haversine formula
  • 2015.545 miles
  • 3243.705 kilometers
  • 1751.461 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
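
A haversine implementation is much shorter. The sketch below assumes a mean Earth radius of 6371 km; the exact radius chosen shifts the result by a kilometre or so, so a small deviation from the figure above is expected.

    import math

    def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometres on a spherical Earth."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2 +
             math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    print(haversine_km(62.2400, -92.5981, 33.6633, -101.8228))  # ~3244 km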

How long does it take to fly from Whale Cove to Lubbock?

The estimated flight time from Whale Cove Airport to Lubbock Preston Smith International Airport is 4 hours and 18 minutes.
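
The page does not state how this estimate is derived. A common rule of thumb is cruise time at a typical jet speed plus a fixed taxi, climb, and descent allowance; the cruise speed and allowance in the sketch below are illustrative assumptions, which is why its output (about 4 h 32 min) does not exactly match the 4 h 18 min quoted above.

    # Hypothetical flight-time estimate: the cruise speed and overhead below
    # are illustrative assumptions, not the calculator's published parameters.
    def estimated_flight_time_h(distance_miles, cruise_mph=500, overhead_h=0.5):
        """Cruise time plus a fixed taxi/climb/descent allowance, in hours."""
        return distance_miles / cruise_mph + overhead_h

    hours = estimated_flight_time_h(2016)
    print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # ~4 h 32 min under these assumptions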

What is the time difference between Whale Cove and Lubbock?

There is no time difference between Whale Cove and Lubbock; both cities observe Central Time.

Flight carbon footprint between Whale Cove Airport (YXN) and Lubbock Preston Smith International Airport (LBB)

On average, flying from Whale Cove to Lubbock generates about 219 kg (484 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
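
The pound figure is a unit conversion from the kilogram figure. Converting the rounded 219 kg gives about 483 lb, so the page's 484 lb suggests it converts an unrounded kilogram value; the per-mile factor printed below is inferred from the page's own numbers, not an official emission factor.

    KG_PER_LB = 0.45359237     # exact definition of the avoirdupois pound

    co2_kg = 219
    print(co2_kg / KG_PER_LB)  # ~482.8 lb; the page's 484 lb implies an unrounded kg value

    # Emission factor implied by the page's own numbers (an inference, not an official figure)
    print(co2_kg / 2016)       # ~0.109 kg CO2 per passenger-mile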

Map of flight path and driving directions from Whale Cove to Lubbock

See the map of the shortest flight path between Whale Cove Airport (YXN) and Lubbock Preston Smith International Airport (LBB).

Airport information

Origin: Whale Cove Airport
City: Whale Cove
Country: Canada
IATA Code: YXN
ICAO Code: CYXN
Coordinates: 62°14′24″N, 92°35′53″W
Destination: Lubbock Preston Smith International Airport
City: Lubbock, TX
Country: United States
IATA Code: LBB
ICAO Code: KLBB
Coordinates: 33°39′48″N, 101°49′22″W
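
The distance formulas above take decimal degrees, so the DMS coordinates listed here have to be converted first. A minimal sketch, treating south latitudes and west longitudes as negative:

    def dms_to_decimal(degrees, minutes, seconds, hemisphere):
        """Convert degrees/minutes/seconds plus hemisphere to signed decimal degrees."""
        sign = -1 if hemisphere in ("S", "W") else 1
        return sign * (degrees + minutes / 60 + seconds / 3600)

    print(dms_to_decimal(62, 14, 24, "N"), dms_to_decimal(92, 35, 53, "W"))    # YXN: 62.24, -92.5981
    print(dms_to_decimal(33, 39, 48, "N"), dms_to_decimal(101, 49, 22, "W"))   # LBB: 33.6633, -101.8228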