
West Germany

West Germany has 2 different meanings across 2 categories:

Noun · Proper Noun

Definitions
Noun
1. a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat in World War II; reunified with East Germany in 1990