• Re: Apple accused of underreporting suspected CSAM on its platforms

    From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 18:25:18 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 16:16:59 GMT :

    Liar - here are your own words, little Arlen, where you say "absolutely
    zero" were caught and convicted:

    You not comprehending

    You trying to move the goal post is how I know you're a low-level troll.
    You can't escape your own words, little Arlen. Trim them all you want,
    they are a matter of public record forever:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Wrong.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Sun Jul 28 18:26:02 2024
    XPost: alt.privacy

    On 2024-07-28, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 28 Jul 2024 16:21:06 GMT :

    You don't remember your own words, where you claimed there have been
    "absolutely zero" pedophiles caught and convicted as a result of CSAM
    scanning? Here, let's refresh your rotten memory:

    You not comprehending the difference between zero percent of Apple
    reports versus zero total convictions is how I know you zealots own
    subnormal IQs.

    You trying to move the goal post is how I know you're a low-level troll,
    little Arlen. You can't escape your own words:

    On 2024-07-24, Andrew <andrew@spam.net> wrote:

    Given, for all we know, absolutely zero pedophiles were caught and
    convicted by the Meta & Google (and even Apple) system, the safety
    gained is zero.

    Wrong.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Sun Jul 28 18:49:20 2024
    On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the tech
    companies are rather than complaining at Apple for not sending millions
    of photos to already overwhelmed authorities.
    For all that is in the news stories, it could be ZERO
    convictions resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Apple's solution wouldn't have resulted in any additional loss
    of privacy

    Actually, Apple could not guarantee that, and there was a
    non-zero chance that false positive matches would result in
    privacy violations.

    True. The balance of risk was proportionate, however. Much more so
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is
    harmed, that's one too many. Would you want to be that unlucky
    innocent person who has to deal with charges, a potential criminal
    sexual violation on your record, and all that comes with it? I
    certainly wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated.

    You're entering into Confucius territory. If nothing is triggered, is
    anyone's privacy infringed?

    You're claiming innocent photos would never match, but there is a
    possibility of false matches inherent in the algorithm, no matter how
    small.

    Do you want a stranger looking at photos of your sick child?

    That wouldn't happen with Apple's method.

    It would. If a sufficient number of images matched Apple's algorithms
    (which are not perfect and allow for false matches), a human being would
    be looking at those photos of your naked sick child. How else do you
    think Apple would determine whether the images in question are or are
    not CSAM? And what happens when that stranger decides "You know what? I
    think these photos are inappropriate even if they don't match known CSAM"?
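
    (For a rough sense of how the "sufficient number of images" threshold
    changes the odds, here is a minimal sketch with made-up numbers - nothing
    below comes from Apple's published figures:)

        # Probability that an innocent library of n photos produces at least t
        # false matches, assuming each photo independently false-matches with
        # probability p. All numbers here are hypothetical.
        from math import comb

        def prob_at_least(n: int, t: int, p: float) -> float:
            # P(X >= t) = 1 - P(X <= t-1) for X ~ Binomial(n, p)
            return 1.0 - sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(t))

        n, p = 10_000, 1e-6   # 10,000 photos, one-in-a-million per-image error
        for t in (1, 5, 30):  # candidate review thresholds
            # larger thresholds underflow to 0.0 in float, i.e. negligible
            print(t, prob_at_least(n, t, p))

    A higher review threshold drives the account-level odds down sharply,
    though never to exactly zero.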

    What if that stranger came to the conclusion that those photos are
    somehow classifiable as sexual or abusive in some way? Would you want
    to have to argue your case in court because of it?

    That's a lot of ifs and steps.

    Yes, but it's possible.

    No-one is going to be charged for a dubious
    photo of their own child. There are much bigger fish to fry and get into jail.

    You're wrong. It has already happened:

    A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
    Him as a Criminal
    <https://archive.is/78Pla#selection-563.0-1075.217>

    Read the whole article to get a glimpse of what innocent people go
    through who fall victim to this invasive scanning.

    Do you think these parents and their child consider their privacy to be violated? How would you feel if your intimate photos were added to the
    PhotoDNA CSAM database because they were incorrectly flagged?

    ---
    In 2021, the CyberTipline reported that it had alerted authorities
    to “over 4,260 potential new child victims.” The sons of Mark and Cassio were counted among them.
    ---

    A lot of really bad things can happen to good people:

    ---
    “This would be problematic if it were just a case of content
    moderation and censorship,” Ms. Klonick said. “But this is doubly
    dangerous in that it also results in someone being reported to law enforcement.” It could have been worse, she said, with a parent
    potentially losing custody of a child. “You could imagine how this might escalate,” Ms. Klonick said.
    ---

    ...AND since they were comparing images against KNOWN CSAM, false
    positives would naturally be very few to begin with.

    Yes, but one is one too many in my book.

    How many children are you prepared to be abused to protect YOUR
    privacy?

    Now you're being absurd. My right to privacy doesn't cause any children
    to be abused.

    Apple was wise to shelve this proposal
  • From Andrew@21:1/5 to Chris on Mon Jul 29 11:27:12 2024
    Chris wrote on Mon, 29 Jul 2024 07:36:46 -0000 (UTC) :

    I think you need to have a lie down. You literally are making no sense anymore.

    It's not surprising that you uneducated zealots can't follow simple logic.

    What matters isn't the number of reports but the percentage of convictions.

    That you fail to comprehend a concept so obvious and simple is why I have
    ascertained you zealots are of low IQ, as this is not a difficult concept.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Andrew@21:1/5 to Chris on Mon Jul 29 11:23:48 2024
    XPost: alt.privacy

    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never
    changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
    rate is high (based on the number of reports), then they are NOT
    underreporting images.

    b. If the FB/Google reporting rate is high, and yet if their conviction
    rate is low (based on the number of reports), then they are
    overreporting images.

    c. None of us know if either is true unless and until we know the
    conviction rates per Apple, Facebook, & Google - which are not
    in the reports (which were aimed to lambaste Apple).

    d. That conviction rate information is so important that nobody
    is so stupid as to not ASK for it BEFORE making any assessments.

    e. Given the people who wrote those reports are not likely to be
    stupid, the fact they left out the most important factor directly
    implies the obvious, based on the omission itself.

    Now what do you think that omitted fact directly implies, Chris?
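
    (A minimal sketch of the rate-versus-total distinction in points a-c,
    using entirely hypothetical numbers - none of these figures come from the
    reports being discussed:)

        # Hypothetical figures only: "conviction rate per report" is not the
        # same thing as "total convictions".
        reports = {"Apple": 300, "Google": 1_000_000}   # reports filed (made up)
        convictions = {"Apple": 150, "Google": 10_000}  # resulting convictions (made up)

        for company in reports:
            rate = convictions[company] / reports[company]
            print(f"{company}: {convictions[company]} convictions from "
                  f"{reports[company]} reports ({rate:.0%} per report)")

    In this made-up example the lower-volume reporter has the higher
    per-report rate even though its total is smaller - which is the
    comparison the quoted reports never provide.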

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Mon Jul 29 15:25:36 2024
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:

    No-one is going to be charged for a dubious photo of their own
    child. There are much bigger fish to fry and get into jail.

    You're wrong. It has already happened:

    A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
    Him as a Criminal <https://archive.is/78Pla#selection-563.0-1075.217>

    I explicitly said "charged". No-one got charged. The law is working
    just fine. It's the tech, as I've been arguing all along, that's the
    problem.

    So it's okay that these parents and their child had their privacy
    violated, their child's naked photos added to the CSAM database, and
    their accounts (along with years of emails, photos, and so on)
    revoked/deleted simply because they were never officially charged?

    Read the whole article to get a glimpse of what innocent people go
    through who fall victim to this invasive scanning.

    Do you think these parents and their child consider their privacy to
    be violated? How would you feel if your intimate photos were added to
    the PhotoDNA CSAM database because they were incorrectly flagged?

    This wasn't PhotoDNA, which is what Apple's approach was similar to. It was
    Google's AI method, designed to "recognize never-before-seen exploitative
    images of children", which is where the real danger sits.

    It is designed to identify new abuse images based only on the pixel
    data, so all hits will be massively enriched for things that look like
    abuse. A human won't have the ability to accurately identify the
    (likely innocent) motivation for taking the photo and, "to be safe",
    will pass it on to someone else to make the decision, i.e. law
    enforcement. Law enforcement will have access to much more information
    and will see it's an obvious mistake, as seen in your article.

    Actually, a human being does review it with Google's system:

    ---
    A human content moderator for Google would have reviewed the photos
    after they were flagged by the artificial intelligence to confirm they
    met the federal definition of child sexual abuse material. When Google
    makes such a discovery, it locks the user’s account, searches for other exploitative material and, as required by federal law, makes a report to
    the CyberTipline at the National Center for Missing and Exploited
    Children.
    ---

    Apple's system was more like hashing the image data and comparing
    hashes, where false positives are due to algorithmic randomness. The
    pixel data, when viewed by a human, won't be anything like CSAM, and an
    easy decision can be made.

    What's crucial here is that Google are looking for new stuff - which
    is always problematic - whereas Apple's was not. The search space when looking for existing images is much tinier and the impact of false
    positives much, much smaller.
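
    (As a loose illustration of the hash-comparison idea - a simplified
    stand-in only; a real perceptual-hash pipeline is far more involved than
    the plain cryptographic hash used here for shape:)

        # Simplified stand-in for hash-based matching: flag an image only if
        # its hash is already in a database of known hashes. A real system
        # uses perceptual hashes that tolerate resizing/re-encoding.
        import hashlib

        KNOWN_HASHES = {"placeholder-hash-of-a-known-image"}  # hypothetical entries

        def image_hash(image_bytes: bytes) -> str:
            return hashlib.sha256(image_bytes).hexdigest()

        def matches_known(image_bytes: bytes) -> bool:
            # Novel images never match, however they look, which is why the
            # search space (and the false-positive surface) stays small.
            return image_hash(image_bytes) in KNOWN_HASHES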

    Yes, but even in Apple's case, there's a small chance of a false
    positive match.
  • From Alan@21:1/5 to Andrew on Mon Jul 29 11:42:00 2024
    XPost: alt.privacy

    On 2024-07-29 04:23, Andrew wrote:
    Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :

    You not comprehending the difference between zero percent of Apple reports
    versus zero total convictions is how I know you zealots own subnormal IQs.

    Not at all. My position hasn't changed. You, however, have had about three
    different positions on this thread and keep getting confused which one
    you're arguing for. lol.

    Au contraire

    Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
    the number of reports, and the total number of convictions.

    When you figure out that those two things are different, then (and only
    then) will you realize I've maintained the same position throughout.

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
    rate is high (based on the number of reports), then they are NOT
    underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to badgolferman on Mon Jul 29 13:11:32 2024
    On 2024-07-29 13:04, badgolferman wrote:
    Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:

    Actually, a human being does review it with Google's system:

    I was unclear. I'm not saying a human doesn't review, I'm saying that given
    the dozens/hundreds of suspected abuse images they review a day
    they won't have the ability to make informed decisions.

    ---
    A human content moderator for Google would have reviewed the photos
    after they were flagged by the artificial intelligence to confirm they
    met the federal definition of child sexual abuse material.

    What kind of a person would want this job?

    There are lots of jobs that need doing that very few want to do.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chris on Mon Jul 29 13:40:14 2024
    On 2024-07-29 13:38, Chris wrote:
    On 29/07/2024 21:04, badgolferman wrote:
    Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:

    Actually, a human being does review it with Google's system:

    I was unclear. I'm not saying a human doesn't review, I'm saying that
    given the dozens/hundreds of suspected abuse images they review a day
    they won't have the ability to make informed decisions.

    ---
    A human content moderator for Google would have reviewed the photos
    after they were flagged by the artificial intelligence to confirm they
    met the federal definition of child sexual abuse material.

    What kind of a person would want this job?

    I read an article a couple of years ago on the Facebook content
    moderators. Many ended up traumatised and got no support. God it was a
    grim read.

    I think I recall that as well.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Mon Jul 29 16:11:52 2024
    XPost: alt.privacy

    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    […]

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
        rate is high (based on the number of reports), then they are NOT
        underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    In August 2021, Apple announced a plan to scan photos that users stored
    in iCloud for child sexual abuse material (CSAM). The tool was meant to
    be privacy-preserving and allow the company to flag potentially
    problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were
    concerned that the surveillance capability itself could be abused to
    undermine the privacy and security of iCloud users around the world. At
    the beginning of September 2021, Apple said it would pause the rollout
    of the feature to “collect input and make improvements before releasing
    these critically important child safety features.” In other words, a
    launch was still coming.

    Parents and caregivers can opt into the protections through family
    iCloud accounts. The features work in Siri, Apple’s Spotlight search,
    and Safari Search to warn if someone is looking at or searching for
    child sexual abuse materials and provide resources on the spot to report
    the content and seek help.

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the
    Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I
    use macOS software on Apple hardware.

    Today, I was browsing some local images in a subfolder of my Documents
    folder, some HEIC files taken with an iPhone and copied to the Mac using
    the Image Capture program (used for dumping photos from an iOS device
    attached with an USB cable).

    I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a
    lot of Apple OS-level apps because I’m not interested in transmitting
    any of my data whatsoever to Apple over the network - mostly because
    Apple turns over customer data on over 30,000 customers per year to US
    federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.

    Imagine my surprise when browsing these images in the Finder, Little
    Snitch told me that macOS is now connecting to Apple APIs via a program
    named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).

    ...


    Integrate this data and remember it: macOS now contains network-based
    spyware even with all Apple services disabled. It cannot be disabled via
    controls within the OS: you must use third party network filtering
    software (or external devices) to prevent it.

    This was observed on the current version of macOS, macOS Ventura 13.1.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Mon Jul 29 16:21:12 2024
    XPost: alt.privacy

    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    […]

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
        rate is high (based on the number of reports), then they are NOT
        underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    […]

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    […]


    'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the case.'

    <https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>

    'The mediaanalysisd process is a background task that starts every time
    an image file is previewed in Finder, and then calls an Apple service.
    The process is designed to run machine learning algorithms to detect
    objects in photos and make object-based search possible in the Photos
    app. It also helps Finder to detect text and QR codes in photos. Even if
    a user does not use the Photos app or have an iCloud account, the
    process will still run.'

    Apple is not scanning your photos for CSAM

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Chris on Mon Jul 29 23:35:29 2024
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:

    Yes, but even in Apple's case, there's a small chance of a false
    positive match. And were that to happen, there is a danger of an
    innocent person's privacy being violated.

    In every case there's a chance of FPs. Apple would have had a lower FPR
    than *the current* system.

    Given the choice I'm in favour of the better, evidence-based method.

    You're wrong. The choices now are:

    - Use systems and services where CSAM scanning disregards your privacy.

    - Use systems and services that do no CSAM scanning of private content.

    The latter happens to be Apple's systems and services (with the singular exception of email).

    You're in favour of the worse system

    Nope. I have never said that. I'm in favor of no CSAM scanning of
    private content.

    Nope. I don't support any scanning of private content.

    Yet it's already happening so why not support the better method?

    Speak for yourself. It's certainly not happening to my private content.

    I agree - still not good enough for me though.

    "Perfect is the enemy of the good"

    By seeking perfection you and others are allowing and enabling child
    abuse.

    Nah. There is no child abuse occurring in my private content, and my
    decision not to use or support privacy-violating technology isn't
    harming anyone.

    Apple only shelved it for PR reasons, which is a real shame.

    You don't know all of Apple's motivations. What we know is Apple shelved
    it after gathering feedback from industry experts. And many of those
    experts were of the opinion that even with Apple's precautions, the risk
    of violating people's privacy was too great.

    That wasn't the consensus. The noisy tin-foil brigade drowned out any
    possible discussion.

    Not true. There was plenty of collaboration and discussion. Here's what
    Apple said about their decision:

    ---
    "Child sexual abuse material is abhorrent and we are committed to
    breaking the chain of coercion and influence that makes children
    susceptible to it," Erik Neuenschwander, Apple's director of user
    privacy and child safety, wrote in the company's response to Heat
    Initiative. He added, though, that after collaborating with an array of
    privacy and security researchers, digital rights groups, and child
    safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

    "Scanning every user's privately stored iCloud data would create new
    threat vectors for data thieves to find and exploit," Neuenschwander
    wrote. "It would also inject the potential for a slippery slope of
    unintended consequences. Scanning for one type of content, for instance,
    opens the door for bulk surveillance and could create a desire to search
    other encrypted messaging systems across content types."
    ---

    Apple should have sim
  • From Jolly Roger@21:1/5 to badgolferman on Mon Jul 29 23:36:41 2024
    On 2024-07-29, badgolferman <REMOVETHISbadgolferman@gmail.com> wrote:
    Chris <ithinkiam@gmail.com> wrote:
    Jolly Roger <jollyroger@pobox.com> wrote:
    On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:

    Actually, a human being does review it with Google's system:

    I was unclear. I'm not saying a human doesn't review, I'm saying that
    given the dozens/hundreds of images of suspected abuse images they
    review a day they won't have the ability to make informed decisions.

    --- A human content moderator for Google would have reviewed the
    photos after they were flagged by the artificial intelligence to
    confirm they met the federal definition of child sexual abuse
    material.

    What kind of a person would want this job?

    I'll give you three guesses.

    Guess what kind of people are most attracted to positions of power (such
    as police)?

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Mon Jul 29 23:39:58 2024
    XPost: alt.privacy

    On 2024-07-29, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-29 04:23, Andrew wrote:
    […]

    Specifically....

    a. If the Apple reporting rate is low, and yet if their conviction
    rate is high (based on the number of reports), then they are NOT
    underreporting images.

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    It's zero for *photos*, but not for *email*:

    <https://9to5mac.com/2021/08/23/apple-scans-icloud-mail-for-csam/>

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Mon Jul 29 18:10:29 2024
    XPost: alt.privacy

    Alan wrote:
    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    […]

    Apple's reporting rate is ZERO, because they're not doing scanning of
    images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/


    […]

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    […]


    'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the case.'



    Bullshit.

    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing.
    You might want to consider the only current option to stop Apple from
    scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims false positives
    are one trillion to one, and there is an appeals process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn
    children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe
    situations.

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the
    cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This adds an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud, you will need to back up your photos manually. If you have a PC
    or Mac, you can always copy them to your computer and back them up. You
    can also consider using another cloud service for backups.

    Let's talk about disabling iCloud and also removing any photos you
    already have uploaded. You will have 30 days to recover your photos if
    you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.

    You'll want to backup and disable iCloud, then verify that no photos
    were left on their servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable
    iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos &
    Videos, and your photos will all be on your iPhone, ready to back up
    somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to
    disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Mon Jul 29 17:14:48 2024
    XPost: alt.privacy

    On 2024-07-29 17:10, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:
    On 2024-07-29 04:23, Andrew wrote:
    […]

    Apple's reporting rate is ZERO, because they're not doing scanning
    of images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    […]

    https://sneak.berlin/20230115/macos-scans-your-local-files-now/

    […]


    'A recent thread on Twitter raised concerns that the macOS process
    mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the
    case.'



    Bullshit.

    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    […]


    That discusses a system that Apple disabled.

    And doesn't support your first source AT ALL.

    'Mysk:

    No, macOS doesn’t send info about your local photos to Apple. We analyzed
    mediaanalysisd after an extraordinary claim by Jeffrey Paul that it
    scans local photos and secretly sends the results to an Apple server.

    […]

    We analyzed the network traffic sent and received by mediaanalysisd.
    Well, the call is literally empty. We decrypted it. No headers, no IDs, nothing. Just a simple GET request to this endpoint that returns
    nothing. Honestly, it looks like it is a bug.

    Mysk:

    The issue was indeed a bug and it has been fixed in macOS 13.2. The
    process no longer makes calls to Apple servers.'

    <https://mjtsai.com/blog/2023/01/25/network-connections-from-mediaanalysisd/>

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Tue Jul 30 01:41:02 2024
    XPost: alt.privacy

    On 2024-07-30, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-29 17:10, Chips Loral wrote:

    Bullshit.

    That discusses a system that Apple disabled.

    A proposal that was never implemented and was shelved.

    And doesn't support your first source AT ALL.

    'Mysk:

    No, macOS doesn’t send info about your local photos to Apple. We
    analyzed mediaanalysisd after an extraordinary claim by Jeffrey Paul
    that it scans local photos and secretly sends the results to an Apple
    server.

    […]

    We analyzed the network traffic sent and received by mediaanalysisd.
    Well, the call is literally empty. We decrypted it. No headers, no
    IDs, nothing. Just a simple GET request to this endpoint that returns nothing. Honestly, it looks like it is a bug.

    Mysk:

    The issue was indeed a bug and it has been fixed in macOS 13.2. The
    process no longer makes calls to Apple servers.'

    <https://mjtsai.com/blog/2023/01/25/network-connections-from-mediaanalysisd/>

    Yup. I've had Little Snitch installed on this Mac Studio since I bought
    it, and the Little Snitch Network Monitor has no record of that process
    ever connecting to the internet.

    Chips Loral is either extremely gullible or a simple troll. Either way,
    it's clear he's not interested in factual discourse on this subject.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Alan on Tue Jul 30 01:38:11 2024
    XPost: alt.privacy

    On 2024-07-29, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-29 15:11, Chips Loral wrote:
    Alan wrote:

    Apple's reporting rate is ZERO, because they're not doing scanning
    of images of any kind.

    After getting caught.

    You can't seem to get ANYTHING right, Mac-troll:

    https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/

    'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
    results to an Apple server. This claim was made by a cybersecurity
    researcher named Jeffrey Paul. However, after conducting a thorough
    analysis of the process, it has been determined that this is not the
    case.'

    <https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>

    'The mediaanalysisd process is a background task that starts every
    time an image file is previewed in Finder, and then calls an Apple
    service. The process is designed to run machine learning algorithms
    to detect objects in photos and make object-based search possible in
    the Photos app. It also helps Finder to detect text and QR codes in
    photos. Even if a user does not use the Photos app or have an iCloud
    account, the process will still run.'

    Apple is not scanning your photos for CSAM

    Yup. Jeffrey Paul should be embarrassed and ashamed of himself. He's not
    much of a "hacker and security researcher" if he didn't even bother to
    learn what the process actually does before making outlandish claims
    about it. What a weirdo.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Chips Loral on Tue Jul 30 10:14:01 2024
    XPost: alt.privacy

    On 2024-07-30 10:01, Chips Loral wrote:
    Alan wrote:
    The issue was indeed a bug


    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.

    You're just showing your ignorance now.

    1. There was a proposed Apple system for checking images that were to be uploaded to Apple's iCloud system for photos and videos. That checking
    was going to take place ON THE PHONE and it was only going to compare
    images to KNOWN CSAM images. That system is what your "article" is
    talking about.

    2. That system was never actually implemented.

    3. Long after that, someone noticed a network connection made by a piece
    of software called "mediaanalysisd" (media analysis daemon) to an Apple
    server, but that connection:

    a. Was a GET that never actually sent any information TO the server.

    b. Was clearly a bug, as it was removed during an OS update.
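
    (For point 3a: a bare GET carries no request body, so nothing about local
    files rides along with it - a minimal sketch against a hypothetical
    endpoint, not the actual Apple URL:)

        # Illustration only: a GET with no body and no custom headers sends
        # nothing beyond the request line for a (hypothetical) endpoint.
        import urllib.request

        req = urllib.request.Request("https://example.com/api/check", method="GET")
        print(req.data)  # None - no payload accompanies the request
        # urllib.request.urlopen(req)  # would only fetch the (empty) response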

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Chips Loral@21:1/5 to Alan on Tue Jul 30 11:01:40 2024
    XPost: alt.privacy

    Alan wrote:
    The issue was indeed a bug


    https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html

    Apple’s new iPhone photo-scanning feature is a very controversial thing.
    You might want to consider the only current option to stop Apple from
    scanning your photos.

    Apple's new photo-scanning feature will scan photos stored in iCloud to
    see whether they match known Child Sexual Abuse Material (CSAM). The
    problem with this, like many others, is that we often have hundreds of
    photos of our children and grandchildren, and who knows how good or bad
    the new software scanning technology is? Apple claims false positives
    are one trillion to one, and there is an appeals process in place. That
    said, one mistake from this AI, just one, could have an innocent person
    sent to jail and their lives destroyed.

    Apple has many other features as part of these upgrades to protect
    children, and we like them all, but photo-scanning sounds like a problem waiting to happen.

    Here are all of the "features" that come with anti-CSAM, expected to
    roll out with iOS 15 in the fall of 2021.

    Messages: The Messages app will use on-device machine learning to warn
    children and parents about sensitive content.

    iCloud Photos: Before an image is stored in iCloud Photos, an on-device
    matching process is performed for that image against the known CSAM
    hashes.

    Siri and Search: Siri and Search will provide additional resources to
    help children and parents stay safe online and get help with unsafe
    situations.

    Now that you understand how anti-CSAM works, the only way to avoid
    having your photos scanned by this system is to disable iCloud Photos.
    Your photos are scanned when you automatically upload your photos to the
    cloud, so the only current way to avoid having them scanned is not to
    upload them.

    This creates an interesting problem. The majority of iPhone users use
    iCloud to back up their photos (and everything else). If you disable
    iCloud Photos, you will need to back up your photos manually. If you
    have a PC or Mac, you can always copy them to your computer and back
    them up. You can also consider using another cloud service for backups.

    Let's talk about disabling iCloud Photos and also removing any photos
    you have already uploaded. You will have 30 days to recover your photos
    if you change your mind. Any photos still set to upload to iCloud
    Photos when iOS 15 is released will be scanned.

    You'll want to back up your photos and disable iCloud Photos, then
    verify that no photos were left on Apple's servers.

    Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable
    iCloud Photos

    First, we can disable the uploading of iCloud photos while keeping all
    other backups, including your contacts, calendars, notes, and more.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Photos.

    Uncheck iCloud Photos.

    You will be prompted to decide what to do with your current photos.

    If you have the space on your phone, you can click on Download Photos &
    Videos, and your photos will all be on your iPhone, ready to back up
    somewhere else.

    Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server

    While all of your photos should be deleted from Apple's server, we
    should verify that.

    Click on Settings.

    At the top, click on your name.

    Click on iCloud.

    Click on Manage Storage.

    Click on Photos.

    Click on Disable & Delete.

    https://discussions.apple.com/thread/254538081?sortBy=rank

    https://www.youtube.com/watch?v=K_i8rTiXTd8

    How to disable Apple scanning your photos in iCloud and on device. The
    new iOS 15 update will scan iPhone photos and alert authorities if any
    of them contain CSAM. Apple Messages also gets an update to scan and
    warn parents if it detects an explicit image being sent or received.
    This video discusses the new Apple update, privacy implications, how to
    disable iPhone photo scanning, and offers a commentary on tech companies
    and the issue of privacy and electronic surveillance.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Jolly Roger@21:1/5 to Andrew on Fri Jul 26 22:24:28 2024
    XPost: alt.privacy

    On 2024-07-26, Andrew <andrew@spam.net> wrote:
    Jolly Roger wrote on 25 Jul 2024 21:12:36 GMT :

    The cite Chris listed said NOTHING whatsoever about the conviction
    rate.

    It certainly shows the conviction rate is higher than your claimed
    "absolute zero". So it seems it was you who lied first. I realize you
    want us to ignore this, but your focus on who lied demands we
    recognize it.

    zealots lack a normal IQ

    Straight to the insults.

    You both fabricated imaginary convictions

    Actually, both Chris and I provided actual documentation of actual
    convictions.

    You lied.

    No, you're projecting.

    Just showing convictions is completely meaningless to this point.

    No, it's entirely meaningful and disputes your bogus claim that there
    have been "absolutely zero" convictions.

    We're done here.

    --
    E-mail sent to this address may be devoured by my ravenous SPAM filter.
    I often ignore posts from Google. Use a real news client instead.

    JR

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Alan@21:1/5 to Jolly Roger on Fri Jul 26 16:07:57 2024
    On 2024-07-26 15:14, Jolly Roger wrote:
    On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
    On 2024-07-26 09:11, Jolly Roger wrote:
    On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
    On 24/07/2024 22:35, Jolly Roger wrote:
    On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
    Andrew <andrew@spam.net> wrote:
    Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :

    The NSPCC should really be complaining at how ineffectual the
    tech companies are rather than complain at Apple for not sending
    millions of photos to already overwhelmed authorities.

    For all that is in the news stories, it could be ZERO convictions
    resulted.

    Think about that.

    Is it worth everyone's loss of privacy for maybe zero gain in
    child safety?

    Apple's solution wouldn't have resulted in any additional loss of
    privacy

    Actually, Apple could not guarantee that, and there was a non-zero
    chance that false positive matches would result in privacy
    violations.

    True. The balance of risk was proportionate, however. Much moreso
    than the current system.

    Absolutely. I'm just of the opinion if one innocent person is harmed,
    that's one too many. Would you want to be that unlucky innocent
    person who has to deal with charges, a potential criminal sexual
    violation on your record, and all that comes with it? I certainly
    wouldn't.

    Except that Apple's system wouldn't automatically trigger charges.

    An actual human would review the images in question...

    And at that point, someone's privacy may be violated. Do you want a
    stranger looking at photos of your sick child? What if that stranger
    came to the conclusion that those photos are somehow classifiable as
    sexual or abusive in some way? Would you want to have to argue your case
    in court because of it?

    Yes. At that point...

    ...if and only if the person is INNOCENT...

    ...someone's privacy is unnecessarily violated.

    And it's a stretch to imagine that:

    1. Innocent pictures would be matched with KNOWN CSAM images; AND

    (the logical AND)

    2. A person reviewing those images after they've been flagged wouldn't
    notice they don't actually match; AND

    3. The owner of those images at that point would be charged when they
    could then show that they were in fact innocent images.

    All three of those things have to happen before this would ever wind up
    in court.
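
    To put rough numbers on that chain of ANDs: only the first figure below
    is Apple's published estimate (less than a one-in-a-trillion chance per
    year of incorrectly flagging a given account); the other two are
    made-up placeholders, purely to illustrate how the compound probability
    collapses when every step has to happen.

    // Illustrative arithmetic only. pReviewerMisses and pChargedAnyway
    // are assumed values, not measured ones.
    let pFalseFlag      = 1.0 / 1_000_000_000_000.0  // Apple's stated estimate
    let pReviewerMisses = 0.01   // assumed: reviewer fails to spot a false match
    let pChargedAnyway  = 0.01   // assumed: charges proceed despite innocent images
    let pWindsUpInCourt = pFalseFlag * pReviewerMisses * pChargedAnyway
    print(pWindsUpInCourt)       // ~1e-16 per account per year on these assumptions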


    ...AND since they were comparing images against KNOWN CSAM, false
    positives would naturally be very few to begin with.

    Yes, but one is one too many in my book.

    And yet you are fine with innocent people's privacy being violated when
    a search warrant is issued erroneously.

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)