
How far is Lutselk'e from Baltimore, MD?

The distance between Baltimore (Baltimore–Washington International Airport) and Lutselk'e (Lutselk'e Airport) is 2143 miles / 3449 kilometers / 1862 nautical miles.

The driving distance from Baltimore (BWI) to Lutselk'e (YSG) is 3498 miles / 5630 kilometers, and travel time by car is about 68 hours 36 minutes.

Baltimore–Washington International Airport – Lutselk'e Airport
  • 2143 miles
  • 3449 kilometers
  • 1862 nautical miles


Distance from Baltimore to Lutselk'e

There are several ways to calculate the distance from Baltimore to Lutselk'e. Here are two standard methods:

Vincenty's formula (applied above)
  • 2143.268 miles
  • 3449.256 kilometers
  • 1862.449 nautical miles

Vincenty's formula calculates the distance between latitude/longitude points on the earth's surface using an ellipsoidal model of the planet.
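The ellipsoidal calculation above can be reproduced with a standard implementation of Vincenty's inverse formula on the WGS-84 ellipsoid. The sketch below is a minimal pure-Python version (the airport coordinates are taken from the table at the end of this page); a production system would typically use a tested geodesy library instead.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2, tol=1e-12, max_iter=200):
    """Geodesic distance in kilometers on the WGS-84 ellipsoid (Vincenty inverse)."""
    a = 6378137.0              # semi-major axis, meters
    f = 1 / 298.257223563      # flattening
    b = (1 - f) * a            # semi-minor axis, meters

    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    L = math.radians(lon2 - lon1)
    U1 = math.atan((1 - f) * math.tan(phi1))   # reduced latitudes
    U2 = math.atan((1 - f) * math.tan(phi2))
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0:
            return 0.0                         # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1 - sin_alpha ** 2
        cos_2sigma_m = (cos_sigma - 2 * sinU1 * sinU2 / cos2_alpha
                        if cos2_alpha else 0.0)
        C = f / 16 * cos2_alpha * (4 + f * (4 - 3 * cos2_alpha))
        lam_prev = lam
        lam = L + (1 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (2 * cos_2sigma_m ** 2 - 1)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1 + u2 / 16384 * (4096 + u2 * (-768 + u2 * (320 - 175 * u2)))
    B = u2 / 1024 * (256 + u2 * (-128 + u2 * (74 - 47 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4 * (
        cos_sigma * (-1 + 2 * cos_2sigma_m ** 2) -
        B / 6 * cos_2sigma_m * (-3 + 4 * sin_sigma ** 2) *
        (-3 + 4 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma) / 1000.0  # meters -> kilometers

# BWI (39°10′31″N, 76°40′5″W) to YSG (62°25′5″N, 110°40′55″W)
bwi = (39 + 10/60 + 31/3600, -(76 + 40/60 + 5/3600))
ysg = (62 + 25/60 + 5/3600, -(110 + 40/60 + 55/3600))
print(round(vincenty_inverse(*bwi, *ysg), 3))  # roughly 3449 km
```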

Haversine formula
  • 2139.894 miles
  • 3443.825 kilometers
  • 1859.517 nautical miles

The haversine formula calculates the distance between latitude/longitude points assuming a spherical earth (great-circle distance – the shortest distance between two points).
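The spherical calculation is much simpler. A minimal haversine sketch, using a mean Earth radius of 6371 km (the radius this page's figure appears to assume):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2, radius_km=6371.0):
    """Great-circle distance between two lat/lon points on a sphere, in km."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# BWI (39°10′31″N, 76°40′5″W) to YSG (62°25′5″N, 110°40′55″W)
bwi = (39 + 10/60 + 31/3600, -(76 + 40/60 + 5/3600))
ysg = (62 + 25/60 + 5/3600, -(110 + 40/60 + 55/3600))
print(round(haversine_km(*bwi, *ysg), 3))  # roughly 3444 km
```

The ~5 km gap between this result and the Vincenty figure comes entirely from the spherical-versus-ellipsoidal Earth model.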

How long does it take to fly from Baltimore to Lutselk'e?

The estimated flight time from Baltimore–Washington International Airport to Lutselk'e Airport is 4 hours and 33 minutes.
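The page does not state how its flight time is derived. A common rule of thumb is a fixed overhead for taxi, climb, and descent plus cruise at a typical airliner speed; the overhead (30 minutes) and cruise speed (500 mph) below are assumptions, so the result only approximates the figure above.

```python
def flight_time(distance_miles, cruise_mph=500, overhead_hours=0.5):
    """Rough block-time estimate: fixed taxi/climb/descent overhead plus cruise."""
    total = overhead_hours + distance_miles / cruise_mph
    hours = int(total)
    minutes = round((total - hours) * 60)
    return hours, minutes

h, m = flight_time(2143)
print(f"{h} h {m} min")  # 4 h 47 min with these assumed parameters
```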

What is the time difference between Baltimore and Lutselk'e?

Lutselk'e is 2 hours behind Baltimore: Baltimore observes Eastern Time (UTC−5, UTC−4 in summer), while Lutselk'e, in the Northwest Territories, observes Mountain Time (UTC−7, UTC−6 in summer).
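Baltimore is in the Eastern Time zone and Lutselk'e in the Mountain Time zone; since both observe daylight saving time, the offset between them is constant. This can be checked with Python's `zoneinfo` module (requires IANA tz data on the system):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Compare UTC offsets at one fixed instant; DST shifts both zones together.
when = datetime(2024, 1, 15, 12, 0)
baltimore = when.replace(tzinfo=ZoneInfo("America/New_York"))
lutselke = when.replace(tzinfo=ZoneInfo("America/Yellowknife"))
diff = baltimore.utcoffset() - lutselke.utcoffset()
print(diff)  # 2:00:00 -> Lutselk'e is 2 hours behind Baltimore
```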

Flight carbon footprint between Baltimore–Washington International Airport (BWI) and Lutselk'e Airport (YSG)

On average, flying from Baltimore to Lutselk'e generates about 234 kg (516 lb) of CO2 per passenger. These figures are estimates and include only the CO2 generated by burning jet fuel.
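The unit conversion and the implied per-mile emission intensity follow directly from the figures above (the 234 kg estimate itself is taken from this page, not derived here):

```python
CO2_KG = 234        # per-passenger estimate from this page
DISTANCE_MI = 2143  # flight distance from this page

lbs = CO2_KG * 2.20462            # kilograms -> pounds
per_mile = CO2_KG / DISTANCE_MI   # implied kg of CO2 per passenger-mile
print(round(lbs), round(per_mile, 3))  # 516 lb, about 0.109 kg/mile
```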

Map of flight path and driving directions from Baltimore to Lutselk'e

See the map of the shortest flight path between Baltimore–Washington International Airport (BWI) and Lutselk'e Airport (YSG).

Airport information

Origin Baltimore–Washington International Airport
City: Baltimore, MD
Country: United States
IATA Code: BWI
ICAO Code: KBWI
Coordinates: 39°10′31″N, 76°40′5″W
Destination Lutselk'e Airport
City: Lutselk'e
Country: Canada
IATA Code: YSG
ICAO Code: CYLK
Coordinates: 62°25′5″N, 110°40′55″W