
How far is Bijie from Haiphong?

The distance between Haiphong (Cat Bi International Airport) and Bijie (Bijie Feixiong Airport) is 451 miles / 725 kilometers / 392 nautical miles.

The driving distance from Haiphong (HPH) to Bijie (BFJ) is 647 miles / 1042 kilometers, and travel time by car is about 12 hours 28 minutes.

Cat Bi International Airport – Bijie Feixiong Airport

451 miles / 725 kilometers / 392 nautical miles


Distance from Haiphong to Bijie

There are several ways to calculate the distance from Haiphong to Bijie. Here are two standard methods:

Vincenty's formula (applied above)
  • 450.745 miles
  • 725.404 kilometers
  • 391.687 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
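As a sketch of how such a figure can be reproduced, here is a self-contained Python implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The constants and iteration follow the standard published method; the airport coordinates are the ones listed in the airport information below, converted to decimal degrees.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Distance in meters between two points on the WGS-84 ellipsoid,
    computed with Vincenty's inverse formula."""
    a = 6378137.0              # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563      # WGS-84 flattening
    b = (1 - f) * a

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):                  # iterate until lambda converges
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)  # equatorial line: cos2_alpha = 0
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                * (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)

# HPH (20°49′9″N, 106°43′29″E) to BFJ (27°16′1″N, 105°28′19″E)
meters = vincenty_inverse(20.8192, 106.7247, 27.2669, 105.4719)
print(meters / 1000)      # ≈ 725.4 km
print(meters / 1609.344)  # ≈ 450.7 miles
```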

Haversine formula
  • 452.440 miles
  • 728.131 kilometers
  • 393.159 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
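A minimal Python sketch of the haversine formula, using a mean earth radius of 6,371 km and the same airport coordinates, reproduces the figures above:

```python
import math

def haversine(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance in kilometers, assuming a spherical earth."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

km = haversine(20.8192, 106.7247, 27.2669, 105.4719)
print(km)             # ≈ 728.1 km
print(km / 1.609344)  # ≈ 452.4 miles
print(km / 1.852)     # ≈ 393.2 nautical miles
```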

How long does it take to fly from Haiphong to Bijie?

The estimated flight time from Cat Bi International Airport to Bijie Feixiong Airport is 1 hour and 21 minutes.
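The page does not state how this estimate is derived. A common rule of thumb, sketched below in Python, adds a fixed allowance for taxi, climb, and descent to cruise time at an assumed average speed. The 500 mph cruise speed and 30-minute overhead are assumptions, not figures from this page, but they land close to the estimate above.

```python
def estimated_flight_time(distance_miles, cruise_mph=500, overhead_min=30):
    """Rough flight-time estimate: fixed taxi/climb/descent overhead plus
    cruise at an assumed average speed (both parameters are assumptions)."""
    total_min = overhead_min + distance_miles / cruise_mph * 60
    hours, minutes = divmod(round(total_min), 60)
    return f"{hours} h {minutes} min"

print(estimated_flight_time(451))  # 1 h 24 min, close to the 1 h 21 min quoted above
```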

Flight carbon footprint between Cat Bi International Airport (HPH) and Bijie Feixiong Airport (BFJ)

On average, flying from Haiphong to Bijie generates about 91 kg (201 lb) of CO2 per passenger. The figure is an estimate and includes only the CO2 generated by burning jet fuel.
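As a sketch, the per-passenger figure can be reproduced with a flat emission factor per mile. The factor below is simply the one implied by the figures on this page (91 kg over 451 miles, about 0.20 kg of CO2 per mile); it is an assumption derived from this route, not an official coefficient.

```python
KG_PER_MILE = 91 / 451  # implied by this page's figures, ~0.20 kg CO2/mile (assumption)
KG_TO_LB = 2.20462      # kilograms to pounds

def co2_per_passenger(distance_miles, kg_per_mile=KG_PER_MILE):
    """Estimated CO2 from burning jet fuel, per passenger, in kg and lb."""
    kg = distance_miles * kg_per_mile
    return kg, kg * KG_TO_LB

kg, lb = co2_per_passenger(451)
print(f"{kg:.0f} kg CO2 ≈ {lb:.0f} lb")  # 91 kg CO2 ≈ 201 lb
```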

Map of flight path and driving directions from Haiphong to Bijie

See the map of the shortest flight path between Cat Bi International Airport (HPH) and Bijie Feixiong Airport (BFJ).

Airport information

Origin: Cat Bi International Airport
City: Haiphong
Country: Vietnam
IATA Code: HPH
ICAO Code: VVCI
Coordinates: 20°49′9″N, 106°43′29″E
Destination: Bijie Feixiong Airport
City: Bijie
Country: China
IATA Code: BFJ
ICAO Code: ZUBJ
Coordinates: 27°16′1″N, 105°28′19″E