West Hollywood

What is West Hollywood?

West Hollywood definition and meaning on Dictionary terms:
noun

a city in SW California, in Los Angeles County, adjacent to Los Angeles.