
How far is Bijie from São Paulo?

The distance between São Paulo (São Paulo–Guarulhos International Airport) and Bijie (Bijie Feixiong Airport) is 10676 miles / 17181 kilometers / 9277 nautical miles.

São Paulo–Guarulhos International Airport – Bijie Feixiong Airport

Distance: 10676 miles / 17181 kilometers / 9277 nautical miles
Flight time: 20 h 42 min
CO2 emission: 1 409 kg


Distance from São Paulo to Bijie

There are several ways to calculate the distance from São Paulo to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 10675.646 miles
  • 17180.787 kilometers
  • 9276.883 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
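For reference, Vincenty's inverse method can be implemented directly. The sketch below iterates on the WGS-84 ellipsoid; the helper name, convergence tolerance, and iteration cap are illustrative choices, not the site's actual code.

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Inverse Vincenty on the WGS-84 ellipsoid; returns metres."""
    a = 6378137.0          # semi-major axis (m)
    f = 1 / 298.257223563  # flattening
    b = (1 - f) * a        # semi-minor axis (m)

    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate lambda to convergence
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.hypot(cosU2 * sinLam,
                              cosU1 * sinU2 - sinU1 * cosU2 * cosLam)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # 0 for equatorial lines
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (
                cos2SigmaM + C * cosSigma * (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2)
        - B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2)
        * (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)

# GRU (23°26′8″S, 46°28′23″W) to BFJ (27°16′1″N, 105°28′19″E)
metres = vincenty_distance(-23.435556, -46.473056, 27.266944, 105.471944)
print(f"{metres / 1609.344:.3f} miles")  # should land close to 10675.646
```

One caveat worth knowing: Vincenty's iteration can fail to converge for nearly antipodal point pairs, so production code should guard against hitting the iteration cap. This route, while long, is not close enough to antipodal for that to be an issue.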

Haversine formula
  • 10668.966 miles
  • 17170.037 kilometers
  • 9271.078 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
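A minimal haversine sketch, assuming a mean Earth radius of 6371 km (the exact radius constant used above is not stated, which accounts for small differences in the last digits):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance on a sphere; radius_km is an assumed
    # mean Earth radius.
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    h = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(h))

print(haversine_distance(-23.435556, -46.473056, 27.266944, 105.471944))
# roughly 17170 km, matching the figure above
```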

How long does it take to fly from São Paulo to Bijie?

The estimated flight time from São Paulo–Guarulhos International Airport to Bijie Feixiong Airport is 20 hours and 42 minutes.
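That figure is consistent with a simple rule of thumb: a fixed allowance of about 30 minutes for taxi, climb, and descent, plus cruise at an average ground speed of roughly 850 km/h. The constants below are back-derived assumptions that happen to reproduce the figure, not the site's published formula.

```python
def estimate_flight_time(distance_km, cruise_kmh=850, overhead_min=30):
    # Hypothetical rule of thumb: fixed taxi/climb/descent allowance
    # plus cruise at an assumed average ground speed.
    total_min = int(overhead_min + distance_km / cruise_kmh * 60)
    hours, minutes = divmod(total_min, 60)
    return f"{hours} h {minutes} min"

print(estimate_flight_time(17181))  # "20 h 42 min" under these assumptions
```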

Flight carbon footprint between São Paulo–Guarulhos International Airport (GRU) and Bijie Feixiong Airport (BFJ)

On average, flying from São Paulo to Bijie generates about 1 409 kg of CO2 per passenger; 1 409 kilograms equals 3 106 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
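The pound figure follows directly from the definition of the avoirdupois pound (1 lb = 0.45359237 kg exactly); a quick check:

```python
KG_PER_LB = 0.45359237  # exact definition of the avoirdupois pound

co2_kg = 1409
print(f"{co2_kg / KG_PER_LB:.0f} lbs")  # 3106 lbs
```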

Map of flight path from São Paulo to Bijie

See the map of the shortest flight path between São Paulo–Guarulhos International Airport (GRU) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: São Paulo–Guarulhos International Airport
City: São Paulo
Country: Brazil
IATA Code: GRU
ICAO Code: SBGR
Coordinates: 23°26′8″S, 46°28′23″W
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E
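The coordinates above are given in degrees/minutes/seconds; the distance formulas expect signed decimal degrees. A small converter sketch (the helper name is illustrative):

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    # Southern and western hemispheres carry a negative sign.
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

gru = (dms_to_decimal(23, 26, 8, "S"), dms_to_decimal(46, 28, 23, "W"))
bfj = (dms_to_decimal(27, 16, 1, "N"), dms_to_decimal(105, 28, 19, "E"))
print(gru, bfj)  # about (-23.4356, -46.4731) and (27.2669, 105.4719)
```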