My three issues with utilitarianism, in a complex world:
1 - intent counts for nothing
2 - what you believe increases happiness may strike someone else as doing no such thing
3 - how can you accurately determine how much happiness a given action brings about, given the near-infinite number of consequences that ripple out over time from even the smallest intervention in a system?
O.
1 - irrelevant: utilitarianism judges outcomes, not intentions.
2 - we can make some reasonably accurate assumptions: murder, rape, dishonesty and cruelty reduce the sum of happiness; honesty, kindness and generosity increase it.
3 - see 2. Note that I said rule-utilitarianism, in which we act according to rules, and our rules are based on utilitarianism. We don't have to weigh up the utilitarian pros and cons of each act before performing it.