Winter

What is Winter?

Winter is (noun) the coldest season of the year, the season between autumn and spring. In some countries winter usually means snow. It’s too cold to do any gardening in the winter. We’re taking a winter holiday in Mexico. (verb) to spend the winter in a place. These birds normally winter in southern Portugal.


source: Easier English, Student Dictionary Upper Intermediate Level