
How far is Bijie from Curitiba?

The distance between Curitiba (Curitiba Afonso Pena International Airport) and Bijie (Bijie Feixiong Airport) is 10873 miles / 17498 kilometers / 9448 nautical miles.

Curitiba Afonso Pena International Airport – Bijie Feixiong Airport

  • Distance: 10873 miles / 17498 kilometers / 9448 nautical miles
  • Flight time: 21 h 5 min
  • CO2 emission: 1 441 kg


Distance from Curitiba to Bijie

There are several ways to calculate the distance from Curitiba to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 10872.674 miles
  • 17497.872 kilometers
  • 9448.095 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
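For readers who want to check the figure, below is a minimal Python sketch of Vincenty's inverse formula on the WGS-84 ellipsoid. The coordinates are taken from the airport information section at the bottom of the page; the tolerance and iteration cap are arbitrary choices of this sketch, and the result may differ slightly from the number above depending on how the inputs are rounded.

```python
import math

def vincenty_distance_m(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Vincenty inverse formula on the WGS-84 ellipsoid; returns metres.
    Can fail to converge for nearly antipodal points (not the case here)."""
    a = 6378137.0               # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563       # WGS-84 flattening
    b = (1 - f) * a             # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1, sinU2, cosU2 = math.sin(U1), math.cos(U1), math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam, cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0          # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos_sq_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = 0.0 if cos_sq_alpha == 0 else cos_sigma - 2 * sinU1 * sinU2 / cos_sq_alpha
        C = f / 16 * cos_sq_alpha * (4 + f * (4 - 3 * cos_sq_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u_sq = cos_sq_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u_sq / 16384 * (4096 + u_sq * (-768 + u_sq * (320 - 175 * u_sq)))
    B = u_sq / 1024 * (256 + u_sq * (-128 + u_sq * (74 - 47 * u_sq)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
        - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# CWB (25°31′42″S, 49°10′32″W) to BFJ (27°16′1″N, 105°28′19″E)
d_m = vincenty_distance_m(-25.5283, -49.1756, 27.2669, 105.4719)
print(f"{d_m / 1000:.1f} km, {d_m / 1609.344:.1f} miles, {d_m / 1852:.1f} NM")
# Should come out close to the 17497.9 km / 10872.7 mi / 9448.1 NM quoted above.
```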

Haversine formula
  • 10865.875 miles
  • 17486.930 kilometers
  • 9442.187 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth, giving the great-circle distance: the shortest path between the two points along the surface of the sphere.
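For comparison, a minimal sketch of the haversine formula, assuming a mean earth radius of 6371 km (a different radius choice shifts the result slightly):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two points on a sphere, in kilometres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# CWB (25°31′42″S, 49°10′32″W) to BFJ (27°16′1″N, 105°28′19″E)
d_km = haversine_km(-25.5283, -49.1756, 27.2669, 105.4719)
print(f"{d_km:.0f} km, {d_km / 1.609344:.0f} miles, {d_km / 1.852:.0f} NM")
# Roughly 17,487 km / 10,866 miles / 9,442 NM, in line with the figures above.
```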

How long does it take to fly from Curitiba to Bijie?

The estimated flight time from Curitiba Afonso Pena International Airport to Bijie Feixiong Airport is 21 hours and 5 minutes.
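The page does not say how this duration is derived. A common back-of-the-envelope approach is distance divided by an assumed average speed plus a fixed allowance for taxi, climb and descent; the sketch below uses illustrative values (500 mph cruise, 30 minutes overhead) that are not the calculator's actual parameters, so it only roughly approximates the 21 h 5 min quoted above.

```python
def estimate_flight_time_hours(distance_miles, cruise_mph=500.0, overhead_hours=0.5):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb/descent allowance.
    The cruise speed and overhead are illustrative assumptions, not the site's parameters."""
    return distance_miles / cruise_mph + overhead_hours

hours = estimate_flight_time_hours(10873)
print(f"{int(hours)} h {round(hours % 1 * 60)} min")  # about 22 h 15 min with these assumptions
```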

Flight carbon footprint between Curitiba Afonso Pena International Airport (CWB) and Bijie Feixiong Airport (BFJ)

On average, flying from Curitiba to Bijie generates about 1 441 kg of CO2 per passenger; 1 441 kilograms equals 3 176 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
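The kilogram-to-pound conversion is straightforward (1 kg ≈ 2.20462 lb); a quick check, noting that the page's 3 176 lb figure reflects rounding of the underlying estimate:

```python
KG_TO_LB = 2.20462  # pounds per kilogram
print(f"{1441 * KG_TO_LB:.0f} lb")  # ≈ 3177 lb, consistent with the 3 176 lb above up to rounding
```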

Map of flight path from Curitiba to Bijie

See the map of the shortest flight path between Curitiba Afonso Pena International Airport (CWB) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Curitiba Afonso Pena International Airport
City: Curitiba
Country: Brazil
IATA Code: CWB
ICAO Code: SBCT
Coordinates: 25°31′42″S, 49°10′32″W
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E