I could see some of the repetitive moderation tasks being automated in the near future. For example, I think spammers (even human ones) are not all that creative, especially when their content and links are incongruent with the overall community.
However, I suspect automation gets harder when it comes to empathy or understanding the context of humour, trolling, and sarcasm. For this reason, I think we will need some form of human quality control for the foreseeable future.