Society’s views on women were always strict; women always had to earn the respect of others, and it never came easily. First, women were overseen by men in general and by the government in particular. They were overseen because men did not think they were important, and women were never taken seriously because they supposedly did little to help society. In their defense, looking back, it does not seem they were given a fair chance to even try. The only thing they were expected to do was take care of their family and their home.
Society’s views about women and their rights changed for the better as men began to treat women equally and fairly. Women’s roles in society shifted from being only housewives to entering the workforce. They were able to contribute to society by working and being paid for that work, rather than being expected to hold the single job of housewife. Women also gained many rights during this time: acts were passed to grant them freedom and equality, and men began to treat them better. Finally, women got what they had fought for. These important events in history were key in securing the rights women have today. Without these major historic moments, we would not be as fortunate to have the significant rights we hold now.