ranking are that Facebook knows that content that gets an extreme reaction from you is more likely to get a click, a comment or a reshare. And it's interesting, because those clicks and comments and reshares aren't even necessarily for your benefit; it's because they know that other people will produce more content if they get the likes and comments and reshares. They prioritize content in your feed so that you will give little hits of dopamine to your friends, so they'll create more content. And they have run experiments on people where they have confirmed this.

And as part of the information you provided to the Wall Street Journal, it's been found that Facebook altered its algorithm in an attempt to boost these meaningful social interactions (MSI), but rather than strengthening bonds, it rewarded more sensationalism.

I think they'd say they use it.
They turned off downstream MSI when it was health content, probably COVID, and civic content. But Facebook's own algorithms are bad at finding this content; it's still in the raw form for 80% of that sensitive content in countries where they don't have integrity systems in the local language. In Ethiopia, there are 100 million people, and they only support six languages. This strategy of focusing on systems, on AI, to save us is doomed to fail.

First of all, I'm sending a letter to Facebook today: they better not delete any information as it relates to the Rohingya or the investigation. But aren't we also talking about advertising fraud? Aren't you selling something to advertisers that's not really