• Re: Apple accused of underreporting suspected CSAM on its platforms

    From Andrew@21:1/5 to Chips Loral on Tue Jul 30 16:48:08 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    Chips Loral wrote on Mon, 29 Jul 2024 18:10:29 -0600 :

    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM).

    Wow. I'm impressed. I thought there weren't any other adults on this Apple newsgroup, but apparently there are people who also read the news.

    I'm sure the Apple zealots, who only watch Apple's (admittedly brilliant) advertisements for all their news, will follow their normal procedure of

    1. First they will brazenly deny any and all facts they didn't know
    2. When you persist and send them a link - they will NOT click on the
    link as they continue to brazenly deny all facts they don't know
    3. If you persist, they will say that the link doesn't support
    what they "think" you said - and they'll try to deflect that way
    4. For a while they'll deflect the conversation any way their childish
    brains can think of (usually claiming Samsung made Apple do it)
    5. At some point they're forced to accept the fact but they will
    still childishly claim that the link isn't what you said it was
    6. Then their childish brains will concoct all sorts of excuses
    (usually claiming it's a bug so it's not Apple's fault after all,
    or they often use the excuse that Apple only "thought" about it)
    7. Finally, they'll start calling you vitriolic names because their
    childish brains can't handle any fact about Apple they don't like

    Just watch. They're so childish, they're 100% predictable.

    Apple zealots *hate* that Apple never is what Apple told them Apple was.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Andrew on Tue Jul 30 09:54:38 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    On 2024-07-30 09:48, Andrew wrote:
    Chips Loral wrote on Mon, 29 Jul 2024 18:10:29 -0600 :

    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM).

    Wow. I'm impressed. I thought there weren't any other adults on this Apple newsgroup, but apparently there are people who also read the news.

    You're impressed because Loser-2 cited an article from 3 years ago that
    is addressing itself to something Apple didn't actually end up doing?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Tue Jul 30 16:57:42 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    Chris wrote on Tue, 30 Jul 2024 07:29:40 -0000 (UTC) :

    Because I only think logically, my rather sensible position has never
    changed,

    You have variously said that "CSAM is bullshit", "applauded Apple" for shelving their plan, that there were "zero convictions", and that the conviction rate "was the most important" thing.

    Chris,

    The fact that you lack adult comprehension skills is not a reflection on
    me; I said from the start that the only thing that matters is how
    effective the scanning is, because it harms everyone.

    Without knowing how effective the scanning is, you can't make any
    assessment of whether it works.

    All anyone knows is that the most important metric is the conviction rate
    and since that was left out of the reports, that means they're bullshitting
    us.

    Because they're not that stupid.
    They KNOW the only thing that matters is the conviction rate.

    The fact they lambasted Apple without telling us that number is telling.
    It implies the conviction rate is dismal (perhaps even zero).

    Now I realize you have a PhD (you say) in the biological sciences, Chris,
    so I will assume that you clearly understand the difference between
    convictions and conviction rates.

    I don't expect Alan Baker, Alan Browne or Jolly Roger to comprehend the difference between convictions and conviction rates - as none of those
    low-IQ ignorant Apple zealots has earned even a High School diploma.

    But you... Chris... You claim to have earned a PhD, remember.
    So you KNOW the difference between convictions and conviction rates.
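    The distinction being argued here can be made concrete with toy numbers (assumptions for illustration only, not real statistics from any report): a seemingly large raw conviction count can still correspond to a tiny conviction rate.

```python
# Hypothetical figures, purely to illustrate count vs. rate --
# neither number comes from any actual CSAM reporting data.
reports_filed = 1_000_000   # assumed reports forwarded to authorities
convictions = 500           # assumed convictions resulting from them

# The rate is the count normalized by the number of reports.
conviction_rate = convictions / reports_filed
print(f"{convictions} convictions, but a rate of {conviction_rate:.4%}")
```

    With these assumed inputs the script prints "500 convictions, but a rate of 0.0500%" -- the same count reads very differently once the denominator is known, which is the point being made about the omitted metric.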

    Therefore, if you reply that you still can't figure the difference out, I'm going to have to call bullshit on your brazen claim to have earned a PhD.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Tue Jul 30 11:01:58 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    Alan wrote:
    something Apple didn't actually end up doing?


    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing.
    You might want to consider the only current option to stop Apple from
    scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims the odds of a
    false positive are one in one trillion, and there is an appeals
    process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.
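    The article's one-in-a-trillion claim can be put in perspective with a back-of-the-envelope expected-value sketch. Apple's stated claim was a per-account, per-year chance of incorrectly flagging an account; the account count below is an illustrative assumption, not Apple's figure.

```python
# Expected number of wrongly flagged accounts = rate * population.
false_flag_rate = 1e-12      # claimed per-account, per-year odds (1 in 1 trillion)
accounts = 1_000_000_000     # assumed number of iCloud accounts (illustrative)

expected_false_flags = accounts * false_flag_rate
print(expected_false_flags)  # ~0.001 expected wrong flags per year under these assumptions
```

    Under these assumptions the claimed rate implies roughly one wrongly flagged account per thousand years, which is the scale the article's "just one mistake" worry should be weighed against.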

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn
    children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe
    situations.
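    The iCloud Photos step above amounts to a set-membership check against known hashes. A toy sketch follows; note that Apple's announced design used NeuralHash (a perceptual hash) plus private set intersection, not the plain cryptographic hash shown here, and the hash entry below is a placeholder.

```python
import hashlib

# Placeholder known-hash set; a real database would hold hashes of
# previously identified material, not a made-up byte string.
known_hashes = {hashlib.sha256(b"known-bad-image-bytes").hexdigest()}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Hash the image bytes and check membership in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known_hash(b"known-bad-image-bytes"))  # True: bytes identical
print(matches_known_hash(b"an-ordinary-photo"))      # False: no match
```

    One design note: a cryptographic hash like SHA-256 only matches byte-identical files, which is why the real system was described as using a perceptual hash that tolerates resizing and re-encoding.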

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the
    cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This adds an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud, you will need to back up your photos manually. If you have a PC
    or Mac, you can always copy them to your computer and back them up. You
    can also consider using another cloud service for backups.

    Let's talk about disabling iCloud and also removing any photos you
    already have uploaded. You will have 30 days to recover your photos if
    you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.

    You'll want to back up and disable iCloud, then verify that no photos
    were left on Apple's servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable
    iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos &
    Videos, and your photos will all be on your iPhone, ready to back up
    somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete.

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to
    disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Tue Jul 30 10:06:16 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    On 2024-07-30 10:01, Chips Loral wrote:
    Alan wrote:
    something Apple didn't actually end up doing?


    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.

    This is all talking about something that Apple did not end up doing,
    Loser-2:

    'Apple’s Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy'

    <https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/>

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Tue Jul 30 10:54:23 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    On 2024-07-30 10:45, Chips Loral wrote:
    Alan wrote:
    This is all talking about something that Apple did not end up doing,



    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.

    Why do you think this is a good strategy?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Tue Jul 30 11:45:04 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    Alan wrote:
    This is all talking about something that Apple did not end up doing,



    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing.
    You might want to consider the only current option to stop Apple from
    scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims the odds of a
    false positive are one in one trillion, and there is an appeals
    process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn
    children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe
    situations.

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the
    cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This adds an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud, you will need to back up your photos manually. If you have a PC
    or Mac, you can always copy them to your computer and back them up. You
    can also consider using another cloud service for backups.

    Let's talk about disabling iCloud and also removing any photos you
    already have uploaded. You will have 30 days to recover your photos if
    you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.

    You'll want to back up and disable iCloud, then verify that no photos
    were left on Apple's servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable
    iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos &
    Videos, and your photos will all be on your iPhone, ready to back up
    somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete.

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to
    disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Tue Jul 30 11:56:00 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    On 2024-07-30 11:54, Chips Loral wrote:
    Alan wrote:
    Why do you think this is a good strategy?



    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.

    Yada, yada...

    Why do you think posting the same thing over and over again...

    ...after you've been shown it's wrong...

    ...is a good strategy?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to badgolferman on Tue Jul 30 21:35:19 2024
    XPost: misc.phone.mobile.iphone, alt.privacy

    On 2024-07-30, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:

    Wow. I'm impressed. I thought there weren't any other adults on this
    Apple newsgroup, but apparently there are people who also read the
    news.

    I'm sure the Apple zealots, who only watch Apple's (admittedly
    brilliant) advertisements for all their news, will follow their
    normal procedure of

    1. First they will brazenly deny any and all facts they didn't know
    2. When you persist and send them a link - they will NOT click on the
    link as they continue to brazenly deny all facts they don't know
    3. If you persist, they will say that the link doesn't support what
    they "think" you said - and they'll try to deflect that way
    4. For a while they'll deflect the conversation any way their childish
    brains can think of (usually claiming Samsung made Apple do it)
    5. At some point they're forced to accept the fact but they will still
    childishly claim that the link isn't what you said it was
    6. Then their childish brains will concoct all sorts of excuses
    (usually claiming it's a bug so it's not Apple's fault after all, or
    they often use the excuse that Apple only "thought" about it)
    7. Finally, they'll start calling you vitriolic names because their
    childish brains can't handle any fact about Apple they don't like

    Just watch. They're so childish, they're 100% predictable.

    Apple zealots *hate* that Apple never is what Apple told them Apple
    was.

    Jolly Roger jumps to #7 immediately.

    badgolferman desperately wants everyone here to ignore the fact that
    Arlen has been belittling Apple users here for literal years, calling
    people "low IQ" "morons", and all sorts of other juvenile names. When
    someone has the balls to stand up to the bully and point out how
    childish Arlen's trolls are, badgolferman jumps to their defense like a
    white knight. He does this because he actually thinks the rest of us are
    just as gullible as he is. Unfortunately for him, that assumption
    couldn't be further from the truth. It's plain as day that badgolferman supports and enjoys Arlen's trolls. And the fact that he won't dare to
    actually say it and pretends it isn't the case shows the value of his character. Arlen and badgolferman: two peas in a pod.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)