So Mrs. HR and I are playing some catch up and have started watching Breaking Bad.

We are a few episodes into season 2 and have come away a little disappointed thus far (with the exception of Tuco getting whacked).

Should we stay with it? Nothing irks us more than bad TV / film, and if it's not worth it, we're going to move on to another show.

BTW, no spoilers please...