How far is Prince George from North Platte, NE?

The distance between North Platte (North Platte Regional Airport) and Prince George (Prince George Airport) is 1345 miles / 2165 kilometers / 1169 nautical miles.

The driving distance from North Platte (LBF) to Prince George (YXS) is 1698 miles / 2733 kilometers, and travel time by car is about 33 hours 5 minutes.

North Platte Regional Airport – Prince George Airport

1345 miles / 2165 kilometers / 1169 nautical miles

Distance from North Platte to Prince George

There are several ways to calculate the distance from North Platte to Prince George. Here are two standard methods:

Vincenty's formula (applied above)
  • 1345.459 miles
  • 2165.307 kilometers
  • 1169.172 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
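For reference, here is a minimal Python sketch of Vincenty's inverse method on the WGS-84 ellipsoid. The coordinates are the LBF and YXS values from the airport information below, converted to decimal degrees; the simplified loop omits the special handling Vincenty's method needs for coincident or nearly antipodal points.

```python
from math import radians, sin, cos, tan, atan, atan2, sqrt

def vincenty_miles(lat1, lon1, lat2, lon2):
    """Inverse geodesic distance on the WGS-84 ellipsoid (Vincenty, 1975).

    Simplified sketch: no special handling for coincident or
    near-antipodal points, where the iteration can fail to converge.
    """
    a, f = 6378137.0, 1 / 298.257223563  # WGS-84 semi-major axis (m), flattening
    b = (1 - f) * a
    L = radians(lon2 - lon1)
    U1 = atan((1 - f) * tan(radians(lat1)))  # reduced latitudes
    U2 = atan((1 - f) * tan(radians(lat2)))
    sinU1, cosU1, sinU2, cosU2 = sin(U1), cos(U1), sin(U2), cos(U2)

    lam = L
    for _ in range(200):
        sinLam, cosLam = sin(lam), cos(lam)
        sinSigma = sqrt((cosU2 * sinLam) ** 2
                        + (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamOld = lam
        lam = L + (1 - C) * f * sinAlpha * (sigma + C * sinSigma * (
            cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamOld) < 1e-12:  # converged
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    dSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - dSigma) / 1609.344  # metres -> statute miles

print(vincenty_miles(41.1261, -100.6839, 53.8892, -122.6789))  # ≈ 1345.5 miles
```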

Haversine formula
  • 1343.295 miles
  • 2161.824 kilometers
  • 1167.292 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
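A spherical-earth sketch in Python, using a mean earth radius of 3958.8 miles, reproduces the figure above almost exactly (the last decimals depend on the radius chosen):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2, r=3958.8):
    """Great-circle distance between two points, assuming a spherical earth."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * r * asin(sqrt(a))

# LBF and YXS coordinates from the airport information below
print(haversine_miles(41.1261, -100.6839, 53.8892, -122.6789))  # ≈ 1343.4 miles
```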

How long does it take to fly from North Platte to Prince George?

The estimated flight time from North Platte Regional Airport to Prince George Airport is 3 hours and 2 minutes.
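The page does not say how this estimate is computed. One simple illustrative model is distance over an assumed average block speed; the 444 mph figure below is an assumption chosen to match the quoted time, not a published value:

```python
def block_time_minutes(distance_miles, block_speed_mph=444):
    # Assumed average block speed; the site's actual model is not published.
    return distance_miles / block_speed_mph * 60

minutes = block_time_minutes(1345.459)
print(f"{int(minutes // 60)} h {round(minutes % 60)} min")  # -> 3 h 2 min
```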

Flight carbon footprint between North Platte Regional Airport (LBF) and Prince George Airport (YXS)

On average, flying from North Platte to Prince George generates about 170 kg of CO2 per passenger (170 kilograms equals 374 pounds). These figures are estimates and include only the CO2 generated by burning jet fuel.
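The pound figure follows from the standard kilogram-to-pound conversion factor (1 kg ≈ 2.20462 lb):

```python
print(170 * 2.20462)  # 374.8 lb of CO2; the page reports 374
```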

Map of flight path and driving directions from North Platte to Prince George

See the map of the shortest flight path between North Platte Regional Airport (LBF) and Prince George Airport (YXS).

Airport information

Origin: North Platte Regional Airport
City: North Platte, NE
Country: United States
IATA Code: LBF
ICAO Code: KLBF
Coordinates: 41°7′34″N, 100°41′2″W
Destination: Prince George Airport
City: Prince George
Country: Canada
IATA Code: YXS
ICAO Code: CYXS
Coordinates: 53°53′21″N, 122°40′44″W
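The distance examples above use these coordinates in decimal degrees. A small helper for the conversion (the function name is illustrative):

```python
def dms_to_decimal(deg, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to signed decimal degrees."""
    sign = -1 if hemisphere in "SW" else 1
    return sign * (deg + minutes / 60 + seconds / 3600)

print(dms_to_decimal(41, 7, 34, "N"), dms_to_decimal(100, 41, 2, "W"))
# ≈ 41.1261, -100.6839 (LBF, as used in the distance examples above)
```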