Jolly Roger wrote on 28 Jul 2024 16:16:59 GMT :
Liar - here are your own words, little Arlen, where you say "absolutely
zero" were caught and convicted:
You not comprehending
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger wrote on 28 Jul 2024 16:21:06 GMT :
You don't remember your own words, where you claimed there have been
"absolutely zero" pedophiles caught and convicted as a result of CSAM
scanning? Here, let's refresh your rotten memory:
You not comprehending the difference between zero percent of Apple
reports versus zero total convictions is how I know you zealots own
subnormal IQs.
Given, for all we know, absolutely zero pedophiles were caught and
convicted by the Meta & Google (and even Apple) system, the safety
gained is zero.
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-26, Alan <nuh-uh@nope.com> wrote:
On 2024-07-26 09:11, Jolly Roger wrote:
On 2024-07-26, Chris <ithinkiam@gmail.com> wrote:
On 24/07/2024 22:35, Jolly Roger wrote:
On 2024-07-24, Chris <ithinkiam@gmail.com> wrote:
Andrew <andrew@spam.net> wrote:
Chris wrote on Wed, 24 Jul 2024 07:20:19 -0000 (UTC) :
The NSPCC should really be complaining at how ineffectual the tech companies are rather than complain at Apple for not sending millions of photos to already overwhelmed authorities.
For all that is in the news stories, it could be ZERO convictions resulted.
Think about that.
Is it worth everyone's loss of privacy for maybe zero gain in
child safety?
Apple's solution wouldn't have resulted in any additional loss
of privacy
Actually, Apple could not guarantee that, and there was a
non-zero chance that false positive matches would result in
privacy violations.
True. The balance of risk was proportionate, however. Much moreso
than the current system.
Absolutely. I'm just of the opinion if one innocent person is
harmed, that's one too many. Would you want to be that unlucky
innocent person who has to deal with charges, a potential criminal
sexual violation on your record, and all that comes with it? I
certainly wouldn't.
Except that Apple's system wouldn't automatically trigger charges.
An actual human would review the images in question...
And at that point, someone's privacy may be violated.
You're entering into Confucius territory. If nothing is triggered, is anyone's privacy infringed?
Do you want a stranger looking at photos of your sick child?
That wouldn't happen with Apple's method.
What if that stranger came to the conclusion that those photos are
somehow classifiable as sexual or abusive in some way? Would you want
to have to argue your case in court because of it?
That's a lot of ifs and steps.
No-one is going to be charged for a dubious
photo of their own child. There are much bigger fish to fry and get into jail.
...AND since they were comparing images against KNOWN CSAM, false
positives would naturally be very few to begin with.
Yes, but one is one too many in my book.
How many children are you prepared to be abused to protect YOUR
privacy?
Apple was wise to shelve this proposal
I think you need to have a lie down. You literally are making no sense anymore.
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one
you're arguing for. lol.
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-28, Chris <ithinkiam@gmail.com> wrote:
No-one is going to be charged for a dubious photo of their own
child. There are much bigger fish to fry and get into jail.
You're wrong. It has already happened:
A Dad Took Photos of His Naked Toddler for the Doctor. Google Flagged
Him as a Criminal <https://archive.is/78Pla#selection-563.0-1075.217>
I explicitly said "charged". No-one got charged. The law is working
just fine. It's the tech, as I've been arguing all along, that's the
problem.
Read the whole article to get a glimpse of what innocent people go
through who fall victim to this invasive scanning.
Do you think these parents and their child consider their privacy to
be violated? How would you feel if your intimate photos were added to
the PhotoDNA CSAM database because they were incorrectly flagged?
This wasn't PhotoDNA, which is what Apple's approach was similar to. It was Google's AI method, which is designed to "recognize never-before-seen exploitative images of children", and that is where the real danger sits.
It is designed to identify new abuse images based only on the pixel data, so all hits will be massively enriched for things that look like abuse. A human reviewer won't have the ability to accurately identify the (likely innocent) motivation for taking the photo and, "to be safe", will pass it on to someone else to make the decision, i.e. law enforcement. Law enforcement will have access to much more information and will see it's an obvious mistake, as happened in your article.
Apple's system was more like hashing the image data and comparing hashes, where false positives are due to algorithmic randomness. The pixel data, when viewed by a human, won't look anything like CSAM, and an easy decision can be made.
What's crucial here is that Google are looking for new material - which is always problematic - whereas Apple's system was not. The search space when looking only for existing, known images is far smaller, and the impact of false positives much, much smaller too.
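A minimal sketch in Python of the two approaches being contrasted above. Everything in it is illustrative: the "known" hash set holds only a placeholder, an exact SHA-256 digest stands in for the perceptual hashes that PhotoDNA and Apple's NeuralHash actually use, and the classifier is a stub rather than a real model.

import hashlib

# Placeholder stand-in for a database of hashes of *known* CSAM. Real systems
# use perceptual hashes so that re-encoded or resized copies still match; an
# exact SHA-256 digest is used here only to keep the sketch short.
KNOWN_BAD_HASHES = {hashlib.sha256(b"placeholder known image").hexdigest()}

def matches_known_database(image_bytes: bytes) -> bool:
    """Apple-style check: flag only images that match an existing, known entry.

    False positives come from hash collisions, so a wrongly flagged image will
    usually look nothing like CSAM and a human reviewer can dismiss it easily.
    """
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_HASHES

def classifier_flags_new_csam(image_bytes: bytes) -> bool:
    """Google-style check (stub): a trained model tries to spot *new* abuse imagery.

    Because it judges pixel content, its false positives are precisely the
    innocent photos that superficially resemble abuse, which is a much harder
    call for a human moderator.
    """
    score = 0.0  # placeholder for a real model's confidence score
    return score > 0.5

if __name__ == "__main__":
    sample = b"not a real image"
    print("known-hash match:", matches_known_database(sample))
    print("classifier flag:", classifier_flags_new_csam(sample))

The practical difference shows up in the review queue: hash-collision false positives look obviously innocent, while classifier false positives are, by construction, the images that most resemble abuse.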
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on
the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction
rate is high (based on the number of reports), then they are NOT
underreporting images.
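A rough worked example of the distinction being drawn here, with invented numbers (these are not real report or conviction figures):

# Invented figures, purely to separate "conviction rate per report" from "total convictions".
apple_reports, apple_convictions = 200, 50          # few reports, high per-report rate
meta_reports, meta_convictions = 1_000_000, 5_000   # many reports, low per-report rate

apple_rate = apple_convictions / apple_reports      # 0.25  -> 25% of reports end in conviction
meta_rate = meta_convictions / meta_reports          # 0.005 -> 0.5% of reports end in conviction

print(f"Apple: {apple_rate:.1%} per report, {apple_convictions} convictions in total")
print(f"Meta:  {meta_rate:.1%} per report, {meta_convictions} convictions in total")

# One provider can have the higher rate per report while contributing far fewer
# total convictions; the two measures answer different questions.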
Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
Actually, a human being does review it with Google's system:
I was unclear. I'm not saying a human doesn't review; I'm saying that given the dozens/hundreds of suspected abuse images they review a day, they won't have the ability to make informed decisions.
---
A human content moderator for Google would have reviewed the photos
after they were flagged by the artificial intelligence to confirm they
met the federal definition of child sexual abuse material.
What kind of a person would want this job?
On 29/07/2024 21:04, badgolferman wrote:
Chris <ithinkiam@gmail.com> wrote:
Jolly Roger <jollyroger@pobox.com> wrote:
On 2024-07-29, Chris <ithinkiam@gmail.com> wrote:
Actually, a human being does review it with Google's system:
I was unclear. I'm not saying a human doesn't review; I'm saying that given the dozens/hundreds of suspected abuse images they review a day, they won't have the ability to make informed decisions.
---
A human content moderator for Google would have reviewed the photos after they were flagged by the artificial intelligence to confirm they met the federal definition of child sexual abuse material.
What kind of a person would want this job?
I read an article a couple of years ago on the Facebook content
moderators. Many ended up traumatised and got no support. God it was a
grim read.
On 2024-07-29 04:23, Andrew wrote:
Chris wrote on Mon, 29 Jul 2024 06:50:53 -0000 (UTC) :
You not comprehending the difference between zero percent of Apple reports versus zero total convictions is how I know you zealots own subnormal IQs.
Not at all. My position hasn't changed. You, however, have had about three different positions on this thread and keep getting confused which one you're arguing for. lol.
Au contraire
Because I only think logically, my rather sensible position has never changed, Chris, and the fact you "think" it has changed is simply that you don't know the difference between the percentage of convictions based on the number of reports, and the total number of convictions.
When you figure out that those two things are different, then (and only
then) will you realize I've maintained the same position throughout.
Specifically....
a. If the Apple reporting rate is low, and yet if their conviction
rate is high (based on the number of reports), then they are NOT
underreporting images.
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
Alan wrote:
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
In August 2021, Apple announced a plan to scan photos that users stored
in iCloud for child sexual abuse material (CSAM). The tool was meant to
be privacy-preserving and allow the company to flag potentially
problematic and abusive content without revealing anything else. But the initiative was controversial, and it soon drew widespread criticism from privacy and security researchers and digital rights groups who were
concerned that the surveillance capability itself could be abused to undermine the privacy and security of iCloud users around the world. At
the beginning of September 2021, Apple said it would pause the rollout
of the feature to “collect input and make improvements before releasing these critically important child safety features.” In other words, a
launch was still coming.
Parents and caregivers can opt into the protections through family
iCloud accounts. The features work in Siri, Apple’s Spotlight search,
and Safari Search to warn if someone is looking at or searching for
child sexual abuse materials and provide resources on the spot to report
the content and seek help.
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
Preface: I don’t use iCloud. I don’t use an Apple ID. I don’t use the Mac App Store. I don’t store photos in the macOS “Photos” application, even locally. I never opted in to Apple network services of any kind - I
use macOS software on Apple hardware.
Today, I was browsing some local images in a subfolder of my Documents folder, some HEIC files taken with an iPhone and copied to the Mac using
the Image Capture program (used for dumping photos from an iOS device attached with an USB cable).
I use a program called Little Snitch which alerts me to network traffic attempted by the programs I use. I have all network access denied for a
lot of Apple OS-level apps because I’m not interested in transmitting
any of my data whatsoever to Apple over the network - mostly because
Apple turns over customer data on over 30,000 customers per year to US federal police without any search warrant per Apple’s own self-published transparency report. I’m good without any of that nonsense, thank you.
Imagine my surprise when browsing these images in the Finder, Little
Snitch told me that macOS is now connecting to Apple APIs via a program
named mediaanalysisd (Media Analysis Daemon - a background process for analyzing media files).
...
Integrate this data and remember it: macOS now contains network-based
spyware even with all Apple services disabled. It cannot be disabled via controls within the OS: you must use third party network filtering
software (or external devices) to prevent it.
This was observed on the current version of macOS, macOS Ventura 13.1.
Jolly Roger <jollyroger@pobox.com> wrote:
Yes, but even in Apple's case, there's a small chance of a false positive match. And were that to happen, there is a danger of an innocent person's privacy being violated.
In every case there's a chance of FPs. Apple would have had a lower FPR than *the current* system.
Given the choice I'm in favour of the better, evidence-based method.
You're in favour of the worse system
Nope. I don't support any scanning of private content.
Yet it's already happening so why not support the better method?
I agree - still not good enough for me though.
"Perfect is the enemy of the good"
By seeking perfection you and others are allowing and enabling child
abuse.
Apple only shelved it for PR reasons, which is a real shame.
You don't know all of Apple's motivations. What we know is Apple shelved
it after gathering feedback from industry experts. And many of those
experts were of the opinion that even with Apple's precautions, the risk
of violating people's privacy was too great.
That wasn't the consensus. The noisy tin-foil brigade drowned out any possible discussion.
Apple should have sim
On 2024-07-29 15:11, Chips Loral wrote:
Alan wrote:
Apple's reporting rate is ZERO, because they're not doing scanning of
images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the case.'
Alan wrote:
On 2024-07-29 15:11, Chips Loral wrote:
Alan wrote:
Apple's reporting rate is ZERO, because they're not doing scanning
of images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
https://sneak.berlin/20230115/macos-scans-your-local-files-now/
'A recent thread on Twitter raised concerns that the macOS process
mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the
case.'
Bullshit.
https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.
Apple's new photo-scanning feature will scan photos stored in iCloud to
see whether they match known Child Sexual Abuse Material (CSAM). The
problem with this, like many others, is that we often have hundreds of
photos of our children and grandchildren, and who knows how good or bad
the new software scanning technology is? Apple claims false positives
are one trillion to one, and there is an appeals process in place. That
said, one mistake from this AI, just one, could have an innocent person
sent to jail and their lives destroyed.
Apple has many other features as part of these upgrades to protect
children, and we like them all, but photo-scanning sounds like a problem waiting to happen.
Here are all of the "features" that come with anti-CSAM, expected to
roll out with iOS 15 in the fall of 2021.
Messages: The Messages app will use on-device machine learning to warn children and parents about sensitive content.
iCloud Photos: Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
Siri and Search: Siri and Search will provide additional resources to
help children and parents stay safe online and get help with unsafe situations.
Now that you understand how anti-CSAM works, the only way to avoid
having your photos scanned by this system is to disable iCloud Photos.
Your photos are scanned when you automatically upload your photos to the cloud, so the only current way to avoid having them scanned is not to
upload them.
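A minimal sketch in Python of the gate described above, not Apple's actual implementation: matching happens on-device, only on the path that uploads to iCloud Photos, and an account is only surfaced for human review after a threshold number of matches. The threshold value, function names, and empty hash database below are all hypothetical.

import hashlib

KNOWN_CSAM_HASHES: set[str] = set()  # hypothetical on-device database of known-image hashes
MATCH_THRESHOLD = 30                 # hypothetical; Apple described a multi-match threshold
icloud_photos_enabled = True         # the setting the steps below tell you to turn off
match_count = 0

def flag_account_for_human_review() -> None:
    print("threshold exceeded: queue matched images for human review")

def send_to_icloud(image_bytes: bytes) -> None:
    print(f"uploaded {len(image_bytes)} bytes")

def upload_to_icloud(image_bytes: bytes) -> None:
    """Sketch of the upload path: the scan only runs when a photo is headed to iCloud."""
    global match_count
    if not icloud_photos_enabled:
        return  # photo stays local, so no matching is performed
    digest = hashlib.sha256(image_bytes).hexdigest()  # stand-in for a perceptual hash
    if digest in KNOWN_CSAM_HASHES:
        match_count += 1
        if match_count >= MATCH_THRESHOLD:
            flag_account_for_human_review()  # only then does anyone look at the images
    send_to_icloud(image_bytes)

if __name__ == "__main__":
    upload_to_icloud(b"holiday photo")  # no match: photo is simply uploaded
    icloud_photos_enabled = False
    upload_to_icloud(b"another photo")  # iCloud Photos off: never scanned, never uploaded

With iCloud Photos turned off, the function returns before any hashing happens, which is why the instructions that follow focus on disabling the upload.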
This adds an interesting problem. The majority of iPhone users use
iCloud to back up their photos (and everything else). If you disable
iCloud, you will need to back up your photos manually. If you have a PC
or Mac, you can always copy them to your computer and back them up. You
can also consider using another cloud service for backups.
Let's talk about disabling iCloud and also removing any photos you
already have uploaded. You will have 30 days to recover your photos if
you change your mind. Any photos that are on your iPhone when iOS 15 is released will be scanned.
You'll want to backup and disable iCloud, then verify that no photos
were left on their servers.
Stop Apple From Scanning Your iPhone Photos - Back-Up Photos and Disable iCloud Photos
First, we can disable the uploading of iCloud photos while keeping all
other backups, including your contacts, calendars, notes, and more.
Click on Settings.
At the top, click on your name.
Click on iCloud.
Click on Photos.
Uncheck iCloud Photos.
You will be prompted to decide what to do with your current photos.
If you have the space on your phone, you can click on Download Photos & Videos, and your photos will all be on your iPhone, ready to back up somewhere else.
Stop Apple From Scanning Your iPhone Photos - Delete Photos on Server
While all of your photos should be deleted from Apple's server, we
should verify that.
Click on Settings.
At the top, click on your name.
Click on iCloud.
Click on Manage Storage.
Click on Photos.
Click on Disable & Delete
https://discussions.apple.com/thread/254538081?sortBy=rank
https://www.youtube.com/watch?v=K_i8rTiXTd8
How to disable Apple scanning your photos in iCloud and on device. The
new iOS 15 update will scan iPhone photos and alert authorities if any
of them contain CSAM. Apple Messages also gets an update to scan and
warn parents if it detects an explicit image being sent or received.
This video discusses the new Apple update, privacy implications, how to disable iPhone photo scanning, and offers a commentary on tech companies
and the issue of privacy and electronic surveillance.
On 2024-07-29 17:10, Chips Loral wrote:
Bullshit.
That discusses a system that Apple disabled.
And doesn't support your first source AT ALL.
'Mysk:
No, macOS doesn’t send info about your local photos to Apple. We analyzed mediaanalysisd after an extraordinary claim by Jeffrey Paul that it scans local photos and secretly sends the results to an Apple server.
[…]
We analyzed the network traffic sent and received by mediaanalysisd.
Well, the call is literally empty. We decrypted it. No headers, no
IDs, nothing. Just a simple GET request to this endpoint that returns nothing. Honestly, it looks like it is a bug.
Mysk:
The issue was indeed a bug and it has been fixed in macOS 13.2. The
process no longer makes calls to Apple servers.'
<https://mjtsai.com/blog/2023/01/25/network-connections-from-mediaanalysisd/>
On 2024-07-29 15:11, Chips Loral wrote:
Alan wrote:
Apple's reporting rate is ZERO, because they're not doing scanning
of images of any kind.
After getting caught.
You can't seem to get ANYTHING right, Mac-troll:
https://www.wired.com/story/apple-photo-scanning-csam-communication-safety-messages/
'A recent thread on Twitter raised concerns that the macOS process mediaanalysisd, which scans local photos, was secretly sending the
results to an Apple server. This claim was made by a cybersecurity
researcher named Jeffrey Paul. However, after conducting a thorough
analysis of the process, it has been determined that this is not the
case.'
<https://pawisoon.medium.com/debunked-the-truth-about-mediaanalysisd-and-apples-access-to-your-local-photos-on-macos-a42215e713d1>
'The mediaanalysisd process is a background task that starts every
time an image file is previewed in Finder, and then calls an Apple
service. The process is designed to run machine learning algorithms
to detect objects in photos and make object-based search possible in
the Photos app. It also helps Finder to detect text and QR codes in
photos. Even if a user does not use the Photos app or have an iCloud
account, the process will still run.'
Apple is not scanning your photos for CSAM
Alan wrote:
The issue was indeed a bug
https://www.majorgeeks.com/content/page/stop_apple_scanning_iphone_photos.html
Apple’s new iPhone photo-scanning feature is a very controversial thing. You might want to consider the only current option to stop Apple from scanning your photos.
Jolly Roger wrote on 25 Jul 2024 21:12:36 GMT :
The cite Chris listed said NOTHING whatsoever about the conviction
rate.
It certainly shows the conviction rate is higher than your claimed
"absolute zero". So it seems it was you who lied first. I realize you
want us to ignore this, but your focus on who lied demands we
recognize it.
zealots lack a normal IQ
You both fabricated imaginary convictions
You lied.
Just showing convictions is completely meaningless to this point.