Found 1 item, similar to West Germany.
English → English (WordNet)
Definition: West Germany
West Germany
n : a republic in north central Europe on the North Sea;
established in 1949 from the zones of Germany occupied by
the British and French and Americans after the German
     defeat; reunified with East Germany in 1990 [syn: Federal
     Republic of Germany]