Facebook is messing with your mind, people… what’s wrong with you? (Say that in the voice of an old dude from the sixties.)
There is this very real, very public outcry against Facebook (again). It’s just another issue in a long line of questions about how the company uses, manipulates and publishes the content that each and every one of us creates on a daily basis. There are people (many of them much smarter and more informed than I am) claiming that it is these types of affronts that will, inevitably, lead to Facebook’s downfall. The latest – as you have no doubt read about in every media space – is that Facebook is no longer just adjusting and throttling the content individuals see based on its highly secretive algorithm, but that it took a group of people (close to 700,000) and intentionally showed them either positive or negative posts as part of a psychological study, to see if this would change their own emotions and the types of content that they share online. Since the news broke, Facebook has explained and apologized for the incident (which took place in 2012).
It’s about the ethics.
Consumers feel manipulated. Always. We’re suspicious. We see "for sale" signs, knowing full well that the price is probably better online. We see TV commercials with cartoon bubbles cleaning a bathtub, and we know that our tubs will never look that good. We see stores in New York City that have had "going out of business" signs in their windows… for decades. We trust car salesmen almost as much as we trust used car salesmen. We live in a world where, if we’re unsure, we can ask people on social media and get responses from people we know… and we often trust complete strangers as well (just take a look at TripAdvisor). So, the naysayers who feel more manipulated than usual by Facebook are digging through their archives and, primarily, updating the same stuff that they blogged about when the Facebook Beacon debacle happened in 2007–2009. They’re warning the world (primarily through Facebook) that these types of activities are embedded in the corporate culture, because the social network continually stumbles when it comes to issues of terms and conditions in relation to privacy. They’re complaining that, because of Facebook’s continued growth and dominance, the company is now arrogant and has a complete lack of respect for its customers.
Ask yourself this: who is Facebook’s customer?
Upon reflection, I don’t think that Facebook’s customer is the person who has a profile that is connected to people in their social network. Facebook went public on Friday, May 18th, 2012 and – at that moment in time – Facebook’s customer became Madison Avenue (not Wall Street). As Facebook attempts to refine its business model and create something that advertisers are willing to pay a premium for, we will continue to see instances like this. And, for the record, I’m not willing to choose sides (just yet). You can’t say that this is like a magazine or a newspaper, either. Most magazines and newspapers charge per issue (or have subscribers who pay a monthly fee) and they have advertisers and sponsors. Those types of media companies have to serve two masters. You could argue that without the people using the service for free, Facebook has nothing to sell to advertisers. Fine. Fair point, but people are not leaving in droves because they’re getting it for free. They’re putting up with it. If any individual is so up in arms about this (or any past or future incident like it on Facebook), they do have a choice: they can delete their profile. They can remove themselves from the experience. Some might argue that if enough users did this, it would force Facebook to change. I don’t believe this to be true. It’s not because Facebook may be too big to fail; it’s because, even if millions of people did leave, it would still have a substantial enough audience that brands would want to get in front of. Facebook would still own businesses like Instagram and have other interests that could sustain an advertising model.
Facebook is trying to find itself.
Google didn’t have the AdWords model right away. It took years to develop and many more years to get it to where it is today. It was different, and it continues to be one of the purest definitions of native advertising that I have seen to date. Facebook is still figuring out what makes its model unique and effective (targeting, in this day and age, may not be enough). Understanding sentiment, and how that can shift from user to user, isn’t that far of a stretch for a company that needs something really different to keep it on the media planners’ lists. The question that Facebook users need to ask themselves is this: "Am I okay with Facebook running these types of experiments in order to find a powerful business model, or is it simply not worth it and time to delete my profile and move on?" Personally, I think the solution is simple: Facebook should ask its user base for volunteers to take part in these types of tests (without divulging too many details about them, so that it doesn’t dilute the responses). My guess is that they would be surprised by just how many people would be willing to play along (I think that I would play along!).
What do you think will happen?