
How far is Jining from Bojnord?

The distance between Bojnord (Bojnord Airport) and Jining (Jining Qufu Airport) is 3239 miles / 5213 kilometers / 2815 nautical miles.

The driving distance from Bojnord (BJB) to Jining (JNG) is 4046 miles / 6512 kilometers, and travel time by car is about 79 hours 3 minutes.

Bojnord Airport – Jining Qufu Airport

Distance: 3239 miles / 5213 kilometers / 2815 nautical miles
Flight time: 6 h 38 min
Time difference: 4 h 30 min
CO2 emission: 363 kg


Distance from Bojnord to Jining

There are several ways to calculate the distance from Bojnord to Jining. Here are two standard methods:

Vincenty's formula (applied above)
  • 3239.441 miles
  • 5213.374 kilometers
  • 2814.997 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
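As a sketch, the iterative Vincenty inverse method on the WGS-84 ellipsoid can be written as follows, using the airport coordinates listed further down the page (for production use, a vetted library such as geographiclib is a safer choice):

```python
import math

def vincenty(lat1, lon1, lat2, lon2):
    """Geodesic distance in metres on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0                  # WGS-84 semi-major axis (m)
    f = 1 / 298.257223563          # WGS-84 flattening
    b = (1 - f) * a                # semi-minor axis
    L = math.radians(lon2 - lon1)
    # Reduced latitudes on the auxiliary sphere
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)
    lam = L
    for _ in range(200):           # iterate lambda until convergence
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.sqrt((cosU2 * sin_lam) ** 2 +
                              (cosU1 * sinU2 - sinU1 * cosU2 * cos_lam) ** 2)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sm = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                   if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (cos_2sm + C * cos_sigma *
                                     (-1 + 2 * cos_2sm ** 2)))
        if abs(lam - lam_prev) < 1e-12:
            break
    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sm + B / 4 *
        (cos_sigma * (-1 + 2 * cos_2sm ** 2) -
         B / 6 * cos_2sm * (-3 + 4 * sin_sigma ** 2) *
         (-3 + 4 * cos_2sm ** 2)))
    return b * A * (sigma - delta_sigma)
```

With the BJB and JNG coordinates (37.4928°N 57.3081°E and 35.2928°N 116.3467°E), this yields approximately the 5213.374 km quoted above.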

Haversine formula
  • 3232.053 miles
  • 5201.485 kilometers
  • 2808.577 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
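A minimal haversine implementation in Python, assuming the commonly used mean Earth radius of 6371 km:

```python
import math

def haversine(lat1, lon1, lat2, lon2, r=6371.0):
    """Great-circle distance between two points, in km (spherical Earth)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(h))
```

Plugging in the BJB and JNG coordinates (37.4928°N 57.3081°E and 35.2928°N 116.3467°E) reproduces the ~5201 km figure above; the ~12 km gap versus Vincenty reflects the spherical-versus-ellipsoidal Earth models.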

How long does it take to fly from Bojnord to Jining?

The estimated flight time from Bojnord Airport to Jining Qufu Airport is 6 hours and 38 minutes.
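The page does not state how it estimates flight time, but a common heuristic of an average block speed of about 850 km/h plus 30 minutes for takeoff and landing happens to reproduce the figure above (both parameters are assumptions, not the site's documented method):

```python
def flight_time_minutes(distance_km, cruise_kmh=850, overhead_min=30):
    """Crude flight-time estimate: cruise leg plus fixed climb/descent overhead."""
    return round(distance_km / cruise_kmh * 60 + overhead_min)

hours, minutes = divmod(flight_time_minutes(5213.374), 60)
print(f"{hours} h {minutes} min")  # 6 h 38 min
```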

Flight carbon footprint between Bojnord Airport (BJB) and Jining Qufu Airport (JNG)

On average, flying from Bojnord to Jining generates about 363 kg of CO2 per passenger, which is equivalent to about 800 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
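The emission model itself is not given, but the kilogram-to-pound conversion behind the quoted figures is straightforward (1 kg ≈ 2.20462 lb):

```python
def kg_to_lb(kg):
    """Convert kilograms to pounds (1 kg = 2.20462 lb)."""
    return kg * 2.20462

print(round(kg_to_lb(363)))  # 800
```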

Map of flight path and driving directions from Bojnord to Jining

See the map of the shortest flight path between Bojnord Airport (BJB) and Jining Qufu Airport (JNG).

Airport information

Origin Bojnord Airport
City: Bojnord
Country: Iran
IATA Code: BJB
ICAO Code: OIMN
Coordinates: 37°29′34″N, 57°18′29″E
Destination Jining Qufu Airport
City: Jining
Country: China
IATA Code: JNG
ICAO Code: ZSJG
Coordinates: 35°17′34″N, 116°20′48″E
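The coordinates above are given in degrees-minutes-seconds; distance formulas like those earlier on the page need decimal degrees. A small hypothetical helper (not part of this site) for the conversion:

```python
import re

def dms_to_decimal(dms):
    """Parse a string like '37°29′34″N' into signed decimal degrees."""
    m = re.match(r"(\d+)°(\d+)′(\d+)″([NSEW])", dms)
    deg, minute, sec, hemi = m.groups()
    value = int(deg) + int(minute) / 60 + int(sec) / 3600
    # South and West hemispheres are negative by convention
    return -value if hemi in "SW" else value

print(dms_to_decimal("37°29′34″N"))  # ≈ 37.4928
```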