Fun Trivia
Ask FunTrivia: Questions and Answers

Archived Questions

Why does Toronto's Pearson International Airport have the airport code YYZ?

Question #130856. Asked by eyhung.
Last updated May 06 2013.
Originally posted May 06 2013 1:06 AM.

parrotman2006 (18 year member, 346 replies)
Answer has 1 vote.
The first Y comes from the fact that nearly all Canadian airports have an IATA code beginning with Y. As for why Toronto ended up with YYZ in particular, it may simply have been what remained on the list and was assigned to the largest airport, but that is pure speculation.

https://en.wikipedia.org/wiki/List_of_airports_by_IATA_code:_Y

May 06 2013, 1:51 AM
gtho4 (Moderator, 24 year member, 2377 replies)
Answer has 3 votes.
These two posts on flightsim.com shed a bit more light on why:

Why does Toronto's Pearson Intl Airport have code "YYZ"?
07-12-2003, 03:41 AM #7 Calb

"YZ" was assigned to Pearson long before it was known as Pearson International. Originally, Canadian designators were deliberately assigned so as to give little or no indication of their locale -- the letters chosen being purely arbitrary. There were a few exceptions -- or so it would seem: VR=Vancouver, WG=Winnipeg, OW=Ottawa, BR=Brandon and others. I think this "scrambling" was an attempt at security at the outbreak of WW2. It wasn't until the 70's that we saw widespread use of 3 letters to conform to the (then) ICAO standards. Canada never needed more than 2-letter designators -- there weren't enough airports and navaids to make 3-letters necessary. The U.S. on the other hand has always needed 3-letter designators and they prudently have a high percentage that "make sense", thereby making it much easier to remember. Many you can "decode" if you have at least a little familiarity with the area. The letters "C" and "Y" are taken from the Internationally agreed to list of designators for airports and radio station callsigns (broadcast, television, 2-way, Ham, CB, etc). In the U.S. "W" and "K" are used for radio callsigns but, as we all know, only "K" is used for airport and navaid designators. There may be other aspects to this subject but I think these are the high points.
Cal, CYXX (Abbotsford BC)

07-14-2003, 10:07 PM #18 deltabgjim

Calb, you're absolutely right about the two-letter identifiers in Canada, but the story goes back even farther than WW2. When the first Canadian transcontinental railroads and telegraph lines were built, each station had its own two letter Morse code. VR was Vancouver, TZ Toronto, QB Quebec, WG Winnipeg, SJ St. Johns, YC Calgary, OW Ottawa, EG Edmonton, etc. Over time, the Y was added to the front to indicate an airport rather than a train station and to differentiate from U.S. airports.
SIDEBAR: Similarly, the letter N was reserved as a first letter of an airport identifier in the States to indicate a Naval (or Marine) Air Station, e.g. NBC is MCAS Beaufort, SC; NQX is NAS Key West/Boca Chica, FL. Hence, the civil airports in Newark and Norfolk are EWR and ORF, respectively.

Since YTZ was used to indicate Toronto City Airport (on the island in Lake Ontario near downtown), YYZ was selected for Pearson International (made sense to somebody). I'm not sure about Montreal (YUL). Perhaps UL is supposed to bring to mind Dorval, the little town where the airport is (or Hull, which is a couple hundred miles away).

As for the C in the four-letter ICAO code, allow me to explain. There are two coding systems for airports: ICAO (International Civil Aviation Org., a UN agency) and IATA (Int'l Air Transport Assoc., an airline industry group). ICAO codes are used for meteorological reporting, route planning, and ATC. IATA codes (three letters) are used by airlines and travel agents.

All continental U.S. airports start with "K" (KATL, KSFO, KDCA, etc.). Alaskan and Hawaiian airports start with "P". Canadian airports start with "C", etc. Most times, the ICAO and IATA codes are almost the same (KATL = ATL, CYUL = YUL). Many other times, they aren't, even in the U.S. (Hilton Head Island, SC, is KHXD in ICAO but HHH in IATA).

Most airports in the UK start with EG. Hence, EGLL for Heathrow would probably spell out as Northern (E)urope, (G)reat Britain, (LL)ondon, and Gatwic(k) is EGKK. Across the Channel, similar conventions apply: LIRF is (I)talia (R)oma (F)iumicino, LFPG is (F)rance (P)aris Charles de (G)aulle. But when you book an airline ticket, the IATA codes kick in: LHR, LGW, FCO, CDG.

What this boils down to in the sim world is this: if you're in FS2002, you're using ICAO. If you're in Airline Tycoon, you're using IATA.
J.Klotz, Delta Connection/Atlantic Southeast Airlines

http://www.flightsim.com/vbfs/showthread.php?39690-Why-does-Toronto-s-Pearson-Intl-Airport-have-code-YYZ
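To make the prefix convention in the quoted post concrete, here is a minimal, purely illustrative Python sketch (not a real aviation library). The region table and the single exception entry are assumptions chosen to match the examples above; the real list of exceptions is far longer.

```python
# A minimal, illustrative sketch (not a real aviation library) of the
# ICAO/IATA relationship described in the post above: many ICAO codes are
# just the IATA code with a regional prefix letter, with exceptions.

# Assumed region-to-prefix table, simplified for this example only.
REGION_PREFIX = {
    "contiguous_us": "K",   # KATL = ATL, KSFO = SFO
    "canada": "C",          # CYUL = YUL, CYYZ = YYZ
    "alaska_hawaii": "P",   # PANC = ANC, PHNL = HNL
}

# One known exception from the post above; the real list is much longer.
EXCEPTIONS = {
    ("contiguous_us", "HHH"): "KHXD",  # Hilton Head Island, SC
}


def iata_to_icao(region: str, iata: str) -> str:
    """Guess an airport's ICAO code from its IATA code and region."""
    return EXCEPTIONS.get((region, iata), REGION_PREFIX[region] + iata)


print(iata_to_icao("canada", "YYZ"))         # CYYZ -- Toronto Pearson
print(iata_to_icao("contiguous_us", "ATL"))  # KATL -- Atlanta
print(iata_to_icao("contiguous_us", "HHH"))  # KHXD -- the exception case
```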

May 06 2013, 7:17 AM
AyatollahK (17 year member, 713 replies)
Answer has 6 votes. Currently voted the best answer.
I hate myself for answering this, but gtho4's answer taken from flightsim.com is so close to being right that I just want to correct it slightly.

The opening "Y" signifies a Canadian weather reporting station with a land air terminal; in the olden days, a marine air terminal would have been signaled with a "Z". After the end of marine air terminals, Canada repurposed the "Z" to use in cases where the designation might be too close to a foreign designation with a "Y". I think Canada has around 350 "Y"s and a little under 50 "Z"s in its international codes as of right now.

The next two letters were originally the old Morse Code railway station codes for the CNR. For example, "YZ" was assigned to the town of Malton (which is where Pearson Airport is actually located, although Malton is now part of Mississauga), so it dates back to the train system. It was then adopted for use by the weather reporting system, which is how it made its way into the airport codes. The code is still used for Malton's radio transmitter beacon, although the beacon now uses "ZYZ" to differentiate itself from the airport, as you can see if you go to 368 kHz on the following chart.

http://www.dxinfocentre.com/ndb.htm (Malton shows as "ZYZ: Toronto -- Malton".)

Here's the town of Malton:
https://en.wikipedia.org/wiki/Malton,_Ontario

And finally, here's a link to a quote about this from an air traffic control text, which doesn't mention the railroad link to the two-letter codes but otherwise seems to be right:

http://www.airliners.net/aviation-forums/general_aviation/read.main/2726954/

"Although the naming of Canadian airports and weather stations can seem confusing, here is a brief explanation. Originally, in the 1930's, Canada used two letters for identification of a weather reporting station. Additionally, preceding the 2-letter code, was placed a Y (meaning 'yes') where the reporting station was co-located with an airport, a W (meaning 'without') where the reporting station was not co-located with an airport, and a U where the reporting station was co-located with an NDB. An X was used if the last two letters of the code had already been taken by another Canadian ident, and a Z was used if the locator could be confused with a U.S. three letter ident." (Michael Culhane IFR text, section 2.18 p. 64)

May 06 2013, 3:13 PM