
How far is Monbetsu from Peoria, IL?

The distance between Peoria (General Wayne A. Downing Peoria International Airport) and Monbetsu (Monbetsu Airport) is 5720 miles / 9206 kilometers / 4971 nautical miles.



Distance from Peoria to Monbetsu

There are several ways to calculate the distance from Peoria to Monbetsu. Here are two standard methods:

Vincenty's formula (applied above)
  • 5720.156 miles
  • 9205.698 kilometers
  • 4970.680 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
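The sketch below is a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid (semi-major axis 6378137 m, flattening 1/298.257223563). The airport coordinates are taken from the "Airport information" section below, converted to decimal degrees; the iteration count and tolerance are conventional choices, not values published by this site.

```python
import math

def vincenty_miles(lat1, lon1, lat2, lon2, max_iter=200, tol=1e-12):
    """Distance in statute miles via Vincenty's inverse formula (WGS-84)."""
    a = 6378137.0                  # semi-major axis (m)
    f = 1 / 298.257223563          # flattening
    b = (1 - f) * a                # semi-minor axis (m)

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    # Normalize the longitude difference to (-180, 180] degrees
    L = math.radians(((lon2 - lon1 + 180) % 360) - 180)

    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0             # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (
        cos_2sigma_m + B / 4 * (
            cos_sigma * (-1 + 2 * cos_2sigma_m ** 2)
            - B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2)
                    * (-3 + 4 * cos_2sigma_m ** 2)))
    meters = b * A * (sigma - delta_sigma)
    return meters / 1609.344       # meters -> statute miles

# PIA -> MBE using the coordinates listed under "Airport information"
print(round(vincenty_miles(40.664167, -89.693056, 44.303889, 143.403889), 3))
```

With these inputs the result lands within a mile of the 5720.156-mile figure quoted above.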

Haversine formula
  • 5705.583 miles
  • 9182.245 kilometers
  • 4958.016 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
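The haversine computation is compact enough to show in full. This sketch assumes a mean earth radius of 6371 km (a common convention; the site's exact radius choice is not stated), again using the airport coordinates listed below in decimal degrees.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in statute miles, treating the earth as a sphere."""
    R = 6371.0  # assumed mean earth radius in kilometers
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    c = 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))
    return R * c / 1.609344  # kilometers -> statute miles

# PIA -> MBE using the coordinates listed under "Airport information"
print(round(haversine_miles(40.664167, -89.693056, 44.303889, 143.403889), 1))
```

The spherical result comes out roughly 15 miles shorter than the ellipsoidal Vincenty figure on this route, matching the gap between the two lists above.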

How long does it take to fly from Peoria to Monbetsu?

The estimated flight time from General Wayne A. Downing Peoria International Airport to Monbetsu Airport is 11 hours and 19 minutes.
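The site does not publish its flight-time formula. A common back-of-the-envelope estimate adds a fixed taxi/climb allowance to cruise time at a typical jet speed; the 500 mph cruise speed and 30-minute buffer below are assumptions, so the result only roughly approximates the 11 h 19 min quoted above.

```python
def estimate_flight_time(miles, cruise_mph=500, buffer_min=30):
    """Rough flight-time estimate: cruise time plus a fixed taxi/climb buffer.

    Both default parameters are assumptions, not the site's actual values.
    Returns (hours, minutes)."""
    total_min = miles / cruise_mph * 60 + buffer_min
    return int(total_min // 60), round(total_min % 60)

hours, minutes = estimate_flight_time(5720)
print(f"{hours} h {minutes} min")  # close to twelve hours under these assumptions
```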

Flight carbon footprint between General Wayne A. Downing Peoria International Airport (PIA) and Monbetsu Airport (MBE)

On average, flying from Peoria to Monbetsu generates about 679 kg of CO2 per passenger, which equals about 1,497 pounds (lbs). These figures are estimates and include only the CO2 generated by burning jet fuel.
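Such estimates typically multiply the route distance by a per-passenger-mile emission factor. The factor below (0.1187 kg CO2 per passenger-mile) is back-derived from this page's own figures (679 kg over 5720 miles), not an official value, so this is only a sketch of the arithmetic.

```python
def co2_estimate_kg(miles, kg_per_passenger_mile=0.1187):
    """Per-passenger CO2 estimate from jet-fuel burn.

    The factor is back-derived from this page's figures
    (679 kg / 5720 mi), not a published emission factor."""
    return miles * kg_per_passenger_mile

kg = co2_estimate_kg(5720)
lbs = kg * 2.20462  # kilograms -> pounds
print(round(kg), "kg,", round(lbs), "lbs")
```

The same conversion reproduces the pounds figure above: 679 kg x 2.20462 is about 1,497 lbs.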


Airport information

Origin: General Wayne A. Downing Peoria International Airport
City: Peoria, IL
Country: United States
IATA Code: PIA
ICAO Code: KPIA
Coordinates: 40°39′51″N, 89°41′35″W
Destination: Monbetsu Airport
City: Monbetsu
Country: Japan
IATA Code: MBE
ICAO Code: RJEB
Coordinates: 44°18′14″N, 143°24′14″E
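The coordinates above are listed in degrees, minutes, and seconds, while the distance formulas need signed decimal degrees. A small converter can bridge the two; the regex below is an assumption matching the exact format used in this listing.

```python
import re

def dms_to_decimal(dms):
    """Convert a coordinate like 40°39′51″N to signed decimal degrees.

    The pattern assumes the degree/minute/second symbols used on this page."""
    deg, minutes, seconds, hemi = re.match(
        r"(\d+)°(\d+)′(\d+)″([NSEW])", dms).groups()
    value = int(deg) + int(minutes) / 60 + int(seconds) / 3600
    return -value if hemi in "SW" else value  # south/west are negative

# The PIA coordinates from the listing above
print(round(dms_to_decimal("40°39′51″N"), 6))
print(round(dms_to_decimal("89°41′35″W"), 6))
```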