What does feminism mean to you? Over the years, the definition of the word “feminism” has changed. For the record, that definition, according to Webster’s Dictionary, is: “the belief that men and women should have equal opportunities.” That seems simple enough, but for some, feminism has become a controversial—even unnecessary…
Early Bird Books | January 23, 2018