When my son was first born, my husband did not want me to breastfeed. He said he would be embarrassed, even though I was all for doing it. Unfortunately, I just caved in and fed my son a bottle.

Nowadays, women in the US are more open about it, often breastfeeding in public with no cover, and coming under fire for it. Is this good or bad? Is it because our culture sexualizes the breast so much that this most natural of human processes is demonized? In other cultures around the world, the breast is viewed primarily as a way to feed the baby, with only limited sexual connotations. Do you think a woman should cover up in public if she chooses to breastfeed, or should she just whip out her breast, feed her child, and forget the people who have hangups about it? What is your take on it?
CG