
How far is Bijie from Hebron, KY?

The distance between Hebron (Cincinnati/Northern Kentucky International Airport) and Bijie (Bijie Feixiong Airport) is 7822 miles / 12588 kilometers / 6797 nautical miles.

Distance from Hebron to Bijie

There are several ways to calculate the distance from Hebron to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 7821.823 miles
  • 12588.003 kilometers
  • 6796.978 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
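
For illustration, here is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The constants and the iteration follow the standard published formulation; this is an independent sketch, not the calculator's own code:

    import math

    def vincenty_inverse(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
        """Distance in meters between two points on the WGS-84 ellipsoid,
        computed with Vincenty's inverse formula."""
        a = 6378137.0              # WGS-84 semi-major axis (m)
        f = 1 / 298.257223563      # WGS-84 flattening
        b = (1 - f) * a            # semi-minor axis (m)

        L = math.radians(lon2 - lon1)
        U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
        U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
        sinU1, cosU1 = math.sin(U1), math.cos(U1)
        sinU2, cosU2 = math.sin(U2), math.cos(U2)

        lam = L
        for _ in range(max_iter):
            sin_lam, cos_lam = math.sin(lam), math.cos(lam)
            sin_sigma = math.hypot(cosU2 * sin_lam,
                                   cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
            if sin_sigma == 0:
                return 0.0         # coincident points
            cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
            sigma = math.atan2(sin_sigma, cos_sigma)
            sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
            cos2_alpha = 1 - sin_alpha ** 2
            # cos(2*sigma_m); defined as 0 for equatorial lines (cos2_alpha == 0)
            cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                       if cos2_alpha else 0.0)
            C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
            lam_prev = lam
            lam = L + (1 - C) * f * sin_alpha * (
                sigma + C * sin_sigma * (
                    cos_2sm + C * cos_sigma * (-1 + 2 * cos_2sm ** 2)))
            if abs(lam - lam_prev) < tol:
                break

        u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
        A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
        B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
        d_sigma = B * sin_sigma * (cos_2sm + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sm ** 2)
            - B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2)
              * (-3 + 4 * cos_2sm ** 2)))
        return b * A * (sigma - d_sigma)

    # CVG -> BFJ, decimal degrees from the airport data below
    meters = vincenty_inverse(39.048611, -84.667778, 27.266944, 105.471944)
    print(meters / 1609.344)       # about 7822 miles, matching the figure above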

Haversine formula
  • 7808.335 miles
  • 12566.296 kilometers
  • 6785.257 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance (the shortest path between two points along the surface).
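
As a worked check against the figures above, here is a short Python sketch of the haversine formula, using the airport coordinates listed at the bottom of the page and a mean earth radius of 6371 km (a common convention; the page does not say which radius it assumes):

    import math

    def dms_to_deg(d, m, s, negative=False):
        """Convert degrees/minutes/seconds to decimal degrees
        (negative=True for west longitudes and south latitudes)."""
        deg = d + m / 60 + s / 3600
        return -deg if negative else deg

    def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
        """Great-circle distance in kilometers on a sphere of the given radius."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * radius_km * math.asin(math.sqrt(a))

    # CVG: 39°2′55″N, 84°40′4″W   BFJ: 27°16′1″N, 105°28′19″E
    cvg = (dms_to_deg(39, 2, 55), dms_to_deg(84, 40, 4, negative=True))
    bfj = (dms_to_deg(27, 16, 1), dms_to_deg(105, 28, 19))
    km = haversine(*cvg, *bfj)
    print(f"{km:.0f} km  /  {km / 1.609344:.0f} mi")   # about 12566 km / 7808 mi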

How long does it take to fly from Hebron to Bijie?

The estimated flight time from Cincinnati/Northern Kentucky International Airport to Bijie Feixiong Airport is 15 hours and 18 minutes.
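
The page does not publish its timing assumptions; estimates like this usually amount to distance divided by an assumed average gate-to-gate speed. A back-of-the-envelope sketch (the 511 mph figure is back-solved from the 15 h 18 min estimate over 7822 miles and is purely illustrative, not the calculator's published methodology):

    # Flight time ~ distance / assumed average block speed.
    distance_miles = 7822
    avg_speed_mph = 511            # assumed gate-to-gate average; back-solved,
                                   # not the calculator's stated parameter
    hours = distance_miles / avg_speed_mph
    h, m = int(hours), round(hours % 1 * 60)
    print(f"{h} h {m} min")        # 15 h 18 min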

Flight carbon footprint between Cincinnati/Northern Kentucky International Airport (CVG) and Bijie Feixiong Airport (BFJ)

On average, flying from Hebron to Bijie generates about 972 kg of CO2 per passenger; 972 kilograms equals 2,144 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
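
The kilograms-to-pounds conversion is a fixed unit definition (1 lb = 0.45359237 kg exactly), so it is easy to verify:

    co2_kg = 972
    KG_PER_LB = 0.45359237         # exact definition of the avoirdupois pound
    print(round(co2_kg / KG_PER_LB))
    # prints 2143; the page's 2,144 suggests its unrounded estimate
    # sits slightly above 972 kg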

Map of flight path from Hebron to Bijie

See the map of the shortest flight path between Cincinnati/Northern Kentucky International Airport (CVG) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Cincinnati/Northern Kentucky International Airport
City: Hebron, KY
Country: United States
IATA Code: CVG
ICAO Code: KCVG
Coordinates: 39°2′55″N, 84°40′4″W

Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E