Is it normal to think animals are more important than humans?
Alright, first of all, I would like to say this: humans kill, rape, cheat, steal, and insult others. They start wars, some are racist, and some treat you like shit. Do animals hurt our feelings? Do they start wars? No, they don't. End of story: they are more important.