I'm watching First Dates, and it's a great show. It's funny seeing how it's implied that men pay on the first date. We're entering a new era where women are breaking down boundaries we never really thought about, and I'm curious what y'all think about this.
Full disclosure: I'm probably not as progressive as most of you on here, so I feel like men footing the bill is expected.
Edit: Please don't spoil the show, I'm only on ep 2.