Module 1 – Week 8 – Research

I enjoyed this week’s course content on the importance of research and its ethical implications. The first exercise presented a series of case studies in which you place yourself in the role of an Ethics Review Committee. It demonstrated that in some instances one can instinctively judge whether a study is ethically sound and which risks require consideration. One can draw conclusions from scant information when weighing up the benefits and risks of a study; however, the exercise also highlighted why it is pertinent to provide detailed information in the ethics and integrity template when submitting a research proposal for review.

In the Challenge Brief, I was presented with three research scenarios and asked to rate them as low, medium, or high risk based on the ethics checklist. This was an interesting exercise: although I felt that I had correctly identified the level of risk, on reading the discussion board I noticed that I had missed the consideration of cultural differences. When creating a proposal in my own practice I will write out the full checklist so that I do not omit anything.

Research:  

Need to Know: About Facebook’s Emotional Contagion Study 

Facebook conducted a large-scale experiment in January 2012 in which data scientists from Cornell and Facebook studied 689,003 participants for a week. The study aimed to establish “whether or not emotions are contagious online” (Kramer, Guillory and Hancock, 2014).

“In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.” (Kramer, Guillory and Hancock, 2014).

The ethical issues with this study: 

Facebook conducted this study without the consent of the participants; a clause covering research was only added to the Privacy Policy four months after the event. Vulnerable users under the age of 18 were included in the study. Also, given the volume of participants, in my opinion it is likely that some participants had mental health conditions.

Some academics have defended the study: “everyone on the internet is doing A/B testing showing users two versions of something to see which resonates more based on how they click like, share, respond” (Jeff Bercovici, quoted in Hill, 2014). However, the Facebook study set out with the intention to manipulate participants: it tested their emotional response, and whether the information would be harmful, which Pam Dixon of the World Privacy Forum argued is different from A/B testing.
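The A/B testing that Bercovici describes can be sketched in a few lines of Python. This is a minimal, hypothetical illustration, not how Facebook actually ran its experiment: user IDs, the 50/50 split, and the click probabilities are all invented for the example.

```python
import random

def assign_variant(user_id: int) -> str:
    # Stable 50/50 split: even ids always see variant A, odd ids variant B.
    return "A" if user_id % 2 == 0 else "B"

def click_through_rate(events):
    """events: list of (variant, clicked) pairs; returns CTR per variant."""
    shown = {"A": 0, "B": 0}
    clicked = {"A": 0, "B": 0}
    for variant, did_click in events:
        shown[variant] += 1
        if did_click:
            clicked[variant] += 1
    return {v: clicked[v] / shown[v] for v in shown if shown[v]}

# Simulated traffic: variant B is given a slightly higher underlying
# click probability, so the measured rates should reflect that gap.
random.seed(42)
events = []
for user_id in range(10_000):
    variant = assign_variant(user_id)
    p = 0.10 if variant == "A" else 0.12  # illustrative probabilities
    events.append((variant, random.random() < p))

rates = click_through_rate(events)
print(rates)
```

The key difference Dixon points to is visible even in this toy version: here the two variants are neutral alternatives being compared, whereas the emotional contagion study deliberately altered the emotional content shown to one group to see how it affected them.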

Tal Yarkoni wrote a post in defence of Facebook that provided more detail on the nature and mechanics of the study, which in my view makes it less damaging than it first appeared. Content was removed from news feeds, but no additional content was created and added for the experiment; therefore users could have been exposed to that content anyway in normal usage. He also echoed Jeff Bercovici’s sentiment that companies like Amazon, Google and Facebook routinely run experiments to manipulate users and increase revenue, and that the results of such research are used to provide an improved user experience.

Data Privacy

Reading about the Facebook study made me reflect on the Cambridge Analytica scandal, in which the personal data of millions of Facebook users was shared with a company to be used for political advertising.

I also thought about a connected news piece I watched on Channel 4 during the last presidential election in 2020. The piece focused on the Trump campaign’s strategy to deter Black voters in the 2016 election. Data on almost 200 million Americans was leaked. The data was processed by an algorithm to categorise individuals into eight audience types; this information was then used to push targeted political advertising on Facebook and other social media platforms.

“One of the categories was named ‘Deterrence’, which was later described publicly by Trump’s chief data scientist as containing people that the campaign ‘hope don’t show up to vote’” (Rabkin et al., 2020). The research that Channel 4 carried out found that Black Americans were included in the Deterrence category in disproportionately high numbers. The 2016 Trump digital campaign involved a team from Cambridge Analytica.

Conclusion

These studies highlight the importance of consent, data privacy and duty of care. As a User Experience practitioner, having access to user data is critical, but users need to be aware of what information is being accessed and how it is being used. I want to ensure that participants feel empowered to consent to, or withdraw from, a study at any time. I plan to refer to the University’s Research Ethics and Integrity Policy when conducting research for my studies.

Social media is not as free as it initially appears: your data is a commodity, your behaviour will be studied, and you will be targeted by advertising content.

“The Cambridge Analytica revelations may not have changed Facebook, but they did change us. Our eyes are now open. The question is what we will do.” (Wong, 2019) 

Sources

Rabkin, J., Basnett, G., Howker, E., Eastham, J. and Pett, H., 2020. Revealed: Trump campaign strategy to deter millions of Black Americans from voting in 2016. [online] Channel 4 News. Available at: <https://www.channel4.com/news/revealed-trump-campaign-strategy-to-deter-millions-of-black-americans-from-voting-in-2016> [Accessed 24 March 2021].

Wong, J., 2019. The Cambridge Analytica scandal changed the world – but it didn’t change Facebook. [online] The Guardian. Available at: <https://www.theguardian.com/technology/2019/mar/17/the-cambridge-analytica-scandal-changed-the-world-but-it-didnt-change-facebook> [Accessed 24 March 2021].

Hill, K., 2014. Facebook Added ‘Research’ To User Agreement 4 Months After Emotion Manipulation Study. [online] Forbes. Available at: <https://www.forbes.com/sites/kashmirhill/2014/06/30/facebook-only-got-permission-to-do-research-on-users-after-emotion-manipulation-study/?sh=4af32a967a62> [Accessed 24 March 2021].

Kramer, A., Guillory, J. and Hancock, J., 2014. Experimental evidence of massive-scale emotional contagion through social networks. [online] PNAS. Available at: <https://www.pnas.org/content/111/24/8788> [Accessed 24 March 2021].

McClure, L., 2014. Need to know: About Facebook’s emotional contagion study. [online] ideas.ted.com. Available at: <https://ideas.ted.com/need-to-know-about-facebooks-emotional-contagion-study/> [Accessed 24 March 2021].

Yarkoni, T., 2014. In defense of Facebook. [Blog] talyarkoni.org. Available at: <http://www.talyarkoni.org/blog/2014/06/28/in-defense-of-facebook/> [Accessed 24 March 2021].

mentatdgt, 2021. [image] Available at: <https://www.pexels.com/@mentatdgt-330508> [Accessed 24 April 2021].
