Facebook and Twitter data biased, scientists say it portrays people falsely

Facebook and Twitter have been producing a lot of so-called “studies” as of late, and in the process have been making some pretty significant decisions about their advertising. Companies are listening to the results of these studies and taking the information presented in them to heart. As they should, right?

Actually, according to a different study, the opposite seems to be true. That study shows the data is collected and analyzed in ways that fail to account for many errors, as well as for the personalized information that traditional studies account for. In other words, the data feeding these studies isn’t nearly as reliable as Facebook or Twitter would have you believe when you read their results.

There are a lot of potential issues with this. First, big businesses are making decisions based on bad or biased information, instead of on sound, proven scientific methodology and fact. Remember, a lot of information is shared on Facebook, and while some of the data pulled from the site could help businesses identify people worth advertising to, trying to estimate those people’s behavior from the same data may be significantly more challenging.

If nothing else, the practice itself is significantly less developed than Facebook or Twitter would have you believe. Believe it or not, according to Derek Ruths of McGill University and Jurgen Pfeffer of Carnegie Mellon, even something as simple as Facebook not having a ‘dislike’ button severely undermines the validity of the information presented in its studies. Combine that with the fact that social media studies don’t take things like economic standing, background, and bots and spammers – which are fake accounts – into account before analyzing their data.

The best example of a difference Facebook doesn’t account for in its studies is that Facebook “skews significantly female, young and (relatively) lower income,” according to research done by the Pew Research Center in 2013. And it turns out each social network has its own generalized flaws based on the types of people who use the platform – so applying the findings from these studies is questionable: they carry closer to a 65% accuracy rating, compared to the 90% accuracy rating the companies boast publicly.

Ruths went on to note that “the common thread in all these issues is the need for researchers to be more acutely aware of what they’re actually analyzing when working with social media data.” While some of the information taken from these studies is positive – and points to some serious opportunities – the overall package of information being presented needs to be analyzed more carefully.