
West Germany

English

Proper noun

West Germany

  1. (historical, 1949–1990) The Federal Republic of Germany, distinguished from the German Democratic Republic ("East Germany").
  2. (since 1990) The former areas of the Federal Republic of Germany during that period, distinguished from the former East German areas.
  3. (historical, uncommon, 1945–1949) A collective name for the British-, French-, and American-occupied zones of Germany, distinguished from the Soviet-occupied zone.
