This week, I read an interesting article discussing maternity leave and its effects on women. The article claims that the United States is the only nation that does not provide paid maternity leave. This policy is rooted in the belief that someone in the family should always be home caring for the baby -- and that this person should be the woman. Yet companies are giving women less and less time away from work to do exactly that. The article also points to society's myth that working is optional -- when, in today's world, working and earning money is a necessity. Taken together, the need to care for children at home and the need to earn an income place women in a dual role that is difficult to manage. Thus, I wonder how this combination -- maternity leave that is unpaid, and very little of it at that, alongside the necessity of earning an income -- came to be a societal norm and expectation for women.