Weimar

What is Weimar?

noun

a city in the state of Thuringia, in central Germany.