let me hear your thoughts about this…
I was building an argument here, but I decided I just want to provoke you a little without giving away too many of my conclusions.
I might be wrong about this, but I have a feeling that, in the past, writers would explore themes that were relevant to their times but a bit uncomfortable, to make people think about them, whereas today writers find themselves kind of forced to write about themes that mirror the prevailing views of our society in order to sell their books or their content.
What do you say?
I think I’d rather write a story that gives people what they didn’t ask for (in a positive way), and provoke them or show them a new perspective somehow, than just give them what they want to please them and help my publisher sell my books or my content and collect a lot of hearts.
Or, I don’t know, maybe we have to balance both so we don’t starve as writers in the twenty-first century?
What are your thoughts about it?
I wanna hear from you.