Sysop: | Amessyroom |
---|---|
Location: | Fayetteville, NC |
Users: | 27 |
Nodes: | 6 (0 / 6) |
Uptime: | 38:00:38 |
Calls: | 631 |
Calls today: | 2 |
Files: | 1,187 |
D/L today: | 22 files (29,767K bytes) |
Messages: | 173,681 |
The New Zealand Parliament is holding a hearing into the potential for harm to young people from social media <https://www.nzherald.co.nz/nz/politics/meta-claims-instagram-scrolling-not-intentionally-addictive-as-mps-probe-social-media-harm/GTQ3TWLI4NHLDJJMGXUPWPQHUE/>.
Both Meta and TikTok sent representatives, and the Meta one said "I
think the suggestion that our services are designed to be
intentionally addictive really misrepresents our intentions and the
work that we do do."
Doo-doo, indeed. Remember when Frances Haugen blew the whistle on
Meta's internal research proving that they *knew* their operations
were harming young people? Instead of fixing the harm, their response
was to forbid any further research. That's the only way they can claim
that their "services" are not *intentionally* addictive -- by simply
refusing to look at any evidence showing that they are.
Kind of sounds like how the cigarette companies denied the
addictiveness (and adverse health effects) of their products, back in
the day. Admittedly, they went further, by setting up a whole
"research institute" that did its own fake "science" propaganda to
push back against the increasing flood of findings from the
independent research community that the products really were harmful.
Unlike the real, physical world, it's a bit difficult to do
independent research in a virtual world where your every action is
subject to the control of the organization that doesn't want anybody
conducting such research ...