Are reviews still necessary? What I mean is: do we as consumers still follow them, do we actually use them and buy into what they say, or do we ignore them now? I've run into several situations lately where what a review tells me is at odds with what I'm seeing. A show that "professionals" reviewed as bad is seen as great by the audience. A bad movie is actually nowhere near as bad. That business has a way better attitude than its reviews suggest.
I find I'm either ignoring product reviews entirely or, if I feel I really need an opinion, trusting my friends. They're my friends, our interests mostly line up, so their opinions mean way more in the long run.