But many casual users don’t realize that Facebook is also a profound source of data for academic researchers, both inside and outside the company. After all, think about all the information that millions of people freely give to the site: not only their names and schools and locations, but their political affiliations, emotional states, and detailed maps of their social and information networks.

Basically, researchers at Facebook wanted to know if the stuff you see in your newsfeed affects the way you feel.
That discrepancy rarely comes up when Facebook runs experiments for its own internal use, of course, but when the network publishes research — as in this case — other standards may have to apply.
When you see more positive things, you post more positive things.
When you see more negative things, you post more negative things.

In all likelihood, Facebook didn’t manipulate your feelings personally.
Others have argued that, even if it does, Facebook should have to make that case to an independent institutional review board, just as researchers from academia do.
(While it was initially reported that Cornell’s IRB reviewed this study, Cornell has since said that isn’t true.) Ultimately, of course, ethics are a matter of social opinion.
First off, the experiment only affected a tiny fraction of users. Kramer, one of the study’s authors, defended his methodology in a public Facebook post Sunday: “At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it — the result was that people produced an average of one fewer emotional word, per thousand words, over the following week.”

Technically, yes — by logging on.