
How far is Dayong from Maiduguri?

The distance between Maiduguri (Maiduguri International Airport) and Dayong (Zhangjiajie Hehua International Airport) is 6265 miles / 10082 kilometers / 5444 nautical miles.

Maiduguri International Airport – Zhangjiajie Hehua International Airport: 6265 miles / 10082 kilometers / 5444 nautical miles


Distance from Maiduguri to Dayong

There are several ways to calculate the distance from Maiduguri to Dayong. Here are two standard methods:

Vincenty's formula (applied above)
  • 6264.707 miles
  • 10082.069 kilometers
  • 5443.882 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
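The calculator's own source code is not shown; the sketch below is a simplified, self-contained implementation of Vincenty's inverse method on the WGS-84 ellipsoid, written for illustration:

```python
import math

def vincenty_distance(lat1, lon1, lat2, lon2):
    """Ellipsoidal distance in metres between two lat/lon points (degrees)."""
    # WGS-84 ellipsoid parameters
    a = 6378137.0              # semi-major axis (m)
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis (m)

    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(math.radians(lat1)))  # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(math.radians(lat2)))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(200):  # iterate until the longitude difference converges
        sinLam, cosLam = math.sin(lam), math.cos(lam)
        sinSigma = math.sqrt((cosU2 * sinLam) ** 2 +
                             (cosU1 * sinU2 - sinU1 * cosU2 * cosLam) ** 2)
        if sinSigma == 0:
            return 0.0  # coincident points
        cosSigma = sinU1 * sinU2 + cosU1 * cosU2 * cosLam
        sigma = math.atan2(sinSigma, cosSigma)
        sinAlpha = cosU1 * cosU2 * sinLam / sinSigma
        cos2Alpha = 1 - sinAlpha ** 2
        cos2SigmaM = (cosSigma - 2 * sinU1 * sinU2 / cos2Alpha
                      if cos2Alpha else 0.0)  # equatorial line: cos2Alpha = 0
        C = f / 16 * cos2Alpha * (4 + f * (4 - 3 * cos2Alpha))
        lamPrev = lam
        lam = L + (1 - C) * f * sinAlpha * (
            sigma + C * sinSigma * (cos2SigmaM + C * cosSigma *
                                    (-1 + 2 * cos2SigmaM ** 2)))
        if abs(lam - lamPrev) < 1e-12:
            break

    u2 = cos2Alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    deltaSigma = B * sinSigma * (cos2SigmaM + B / 4 * (
        cosSigma * (-1 + 2 * cos2SigmaM ** 2) -
        B / 6 * cos2SigmaM * (-3 + 4 * sinSigma ** 2) *
        (-3 + 4 * cos2SigmaM ** 2)))
    return b * A * (sigma - deltaSigma)  # distance in metres
```

Called with the airport coordinates listed below (MIU at 11.8553°N, 13.0808°E; DYG at 29.1028°N, 110.4428°E), it returns roughly 10 082 km, matching the figure above.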

Haversine formula
  • 6256.617 miles
  • 10069.049 kilometers
  • 5436.852 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
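The haversine formula is much shorter; a minimal sketch, assuming a mean Earth radius of 6371 km (the radius the site uses is not stated):

```python
import math

def haversine_distance(lat1, lon1, lat2, lon2, r=6_371_000.0):
    """Great-circle distance in metres on a sphere of radius r (default: mean Earth radius)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    # haversine of the central angle between the two points
    h = (math.sin(dphi / 2) ** 2 +
         math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * r * math.atan2(math.sqrt(h), math.sqrt(1 - h))
```

For this route the spherical result comes out about 13 km shorter than the ellipsoidal Vincenty figure, consistent with the two values quoted above.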

How long does it take to fly from Maiduguri to Dayong?

The estimated flight time from Maiduguri International Airport to Zhangjiajie Hehua International Airport is 12 hours and 21 minutes.
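The page does not state its timing model. A common approach is simply distance divided by an average block speed; the sketch below uses a speed of about 507 mph, back-solved from the figures above purely for illustration:

```python
def flight_time(distance_miles, cruise_mph=507.0):
    """Rough block-time estimate formatted as 'H h MM min'.

    cruise_mph is an assumed average speed, not the site's documented model.
    """
    hours = distance_miles / cruise_mph
    h, m = int(hours), round((hours - int(hours)) * 60)
    if m == 60:  # carry rounding over the hour boundary
        h, m = h + 1, 0
    return f"{h} h {m:02d} min"
```

With the 6265-mile distance above this yields 12 h 21 min, matching the quoted estimate.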

Flight carbon footprint between Maiduguri International Airport (MIU) and Zhangjiajie Hehua International Airport (DYG)

On average, flying from Maiduguri to Dayong generates about 753 kg (1 659 lbs) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
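The kilogram-to-pound conversion is simple arithmetic; a one-liner for reference (note that the rounded 753 kg converts to roughly 1 660 lbs, so the page's 1 659 lbs presumably starts from an unrounded kilogram value):

```python
KG_TO_LB = 2.20462  # pounds per kilogram

def kg_to_lb(kg):
    """Convert kilograms to pounds."""
    return kg * KG_TO_LB
```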

Map of flight path from Maiduguri to Dayong

See the map of the shortest flight path between Maiduguri International Airport (MIU) and Zhangjiajie Hehua International Airport (DYG).

Airport information

Origin Maiduguri International Airport
City: Maiduguri
Country: Nigeria
IATA Code: MIU
ICAO Code: DNMA
Coordinates: 11°51′19″N, 13°4′51″E
Destination Zhangjiajie Hehua International Airport
City: Dayong
Country: China
IATA Code: DYG
ICAO Code: ZGDY
Coordinates: 29°6′10″N, 110°26′34″E
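The coordinates above are given in degrees, minutes, and seconds, while the distance formulas expect decimal degrees. A small conversion helper, written here for illustration:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds plus hemisphere letter to decimal degrees.

    South and West hemispheres are negative by convention.
    """
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)
```

For example, MIU's latitude 11°51′19″N converts to about 11.8553° and DYG's longitude 110°26′34″E to about 110.4428°.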