I mean, that’s pretty much how a lot of modern cultural norms came about. The British came in and changed shit, and when they left, not everything changed back — a lot of those cultural norms stuck around.
Edit: Idk anything about Middle Eastern history (thanks, American education system!). I just thought it made sense, since Britain has impacted cultures all over the world through its violent colonialism.