West Country

What is the West Country?

West Country definition and meaning, per Dictionary.com:
noun
the region in the southwest of England, including Cornwall, Devon, Somerset, and, sometimes, Dorset.

reference: www.dictionary.com/browse/west-country
