• Breaking News

    Sunday, August 8, 2021

    Apple Daily Advice Thread


    Daily Advice Thread

    Posted: 07 Aug 2021 03:00 AM PDT

    Welcome to the Daily Advice Thread for /r/Apple. This thread can be used to ask for technical advice regarding Apple software and hardware, to ask questions regarding the buying or selling of Apple products or to post other short questions.

    Have a question you need answered? Ask away! Please remember to adhere to our rules, which can be found in the sidebar. On mobile? Here is a screenshot with our rules.

    Join our Discord and IRC chat rooms for support:

    Note: Comments are sorted by /new for your convenience.

    Here is an archive of all previous Daily Advice Threads. This is best viewed on a browser. If on mobile, type in the search bar [author:"AutoModerator" title:"Daily Advice Thread" or title:"Daily Tech Support Thread"] (without the brackets, and including the quotation marks around the titles and author.)

    The Daily Advice Thread is posted each day at 06:00 AM EST (Click HERE for other timezones) and then the old one is archived. It is advised to wait for the new thread to post your question if this time is nearing.

    submitted by /u/AutoModerator
    [link] [comments]

    People should not stop talking about the privacy violations being thrust down our throats under the guise of “protecting the children”

    Posted: 07 Aug 2021 04:39 PM PDT

    Many times when something like this happens people get upset for a week and then go on with their lives because they either stop caring or think it won't matter.

    This is a hill worth dying on for anyone who cares about privacy and freedom from intrusion.

    I think everyone at this point realizes the insane precedent this sets, and if Apple is able to get away with it, other companies will definitely follow suit.

    Do not stop posting about this anywhere you can. Write Apple on their feedback page here or I think it would be more relevant to notify them of the issue on their report page specifically for security and privacy threats here.

    Make it loud and clear that this is not okay, and that if they decide to actually implement these authoritarian and privacy-violating changes, you will sell your phone and stop doing business with them. The only two things Apple cares about are power and money, and we are the ones who give them both. If people continue to rise up, speak out, and threaten their wallet, they might cave and change course.

    But seeing as they sent out an internal memo referring to their own customer base as "screeching voices of the minority," we have a long way to go.

    submitted by /u/ISOlatedLens
    [link] [comments]

    Tencent-backed Epic Games warns of Apple 'state surveillance'

    Posted: 07 Aug 2021 01:05 PM PDT

    Spotify reverses decision: "Spotify will support AirPlay 2"

    Posted: 07 Aug 2021 04:34 PM PDT

    People who are *actually* switching away from iPhone. Where are you going and why will you choose that as an option?

    Posted: 07 Aug 2021 01:11 PM PDT

    In light of everything going on, I'm curious about this. Many are claiming they'll switch away from the iPhone. What are you eyeing, and what makes it the best option? iPhone users only, please.

    submitted by /u/ArchiveSQ
    [link] [comments]

    Megathread: On-device scanning for CSAM

    Posted: 07 Aug 2021 10:01 AM PDT

    Hi everyone,

    Photo and iMessage CSAM scanning has been a hot topic recently and we have tried over the last few days to promote the necessary discussion about these features. However, some users are beginning to show frustration that this topic is dominating the sub's feed.

    Today, there is unlikely to be any new news from Apple about these features, and it therefore seems reasonable to keep the discussion centralized to one place.

    Please comment and discuss below freely (but please keep it civil). Thank you.

    EDIT: A poll was run to see if users wanted this kept to a megathread or if we should open the sub back up to all submissions. By a razor-thin margin of 119 to 121 at the time of closing (4 PM EST), the decision has been made to open the floodgates. We still ask everyone to follow the rules when posting and treat each other with respect, but this megathread is now unstickied and we are off to the races. This thread will stay up to preserve the discussion within. Thanks for participating, everyone!

    submitted by /u/walktall
    [link] [comments]

    Alternatives for automatic photo backup/syncing

    Posted: 07 Aug 2021 07:26 PM PDT

    I've adored Apple for 15 years, and I also worked for Apple for several of those years. I've spent untold thousands of dollars on their products, and I am DEEP in the ecosystem. That said, I've never felt trapped in a walled garden even a little bit: until two days ago.

    One of my biggest points of pride in Apple has for years been their commitment to privacy, and I think it's one of a small number of traits that truly set them apart. This news about the "child protection" "features" has really rocked my boat. I think they're opening Pandora's box, and as a gay man I'm specifically scared for 1. kids whose phones might out them to unaccepting parents, and 2. gay people in homophobic countries where Apple could be pressured to use the photo scanning "feature" to find and report other sexually "deviant" images (lookin at you, China). In regards to the San Bernardino shooter's iPhone, Tim Cook called iOS back doors the software equivalent of cancer, and this seems just as bad (if not worse) to me. This isn't unlocking devices for law enforcement; it's nonconsensual spyware that they're subjecting ALL of their customers to without evidence. Whether they say it's private or not doesn't matter; any amount of preemptive surveillance without probable cause is way over the line. I don't need anyone to explain to me how the feature works and why it's not really a big deal. I've read tons of articles and I know what hashing is. My concern is what happens when someone at Apple decides, or is pressured, to use this technology to find something besides CSAM.

    I love Apple so much, but this is really leaving me concerned about their future and our collective future, if there's no big company left who will truly stand up against these types of privacy invasions. I don't want to use the word heartbroken, but I'm severely disappointed in Apple.

    I've found myself a bit rudderless, not knowing where to go from here. I've considered not updating my OSes, turning off iCloud Photo Library, and even ruminating on what possible options I could have for switching platforms. It's not a matter of being concerned about what someone would find in my library, but rather that they're able to snoop through it in the first place.

    I'm curious what solutions other people are considering for photo backups and syncing.

    submitted by /u/hpbrocster
    [link] [comments]

    The community has spoken. Out of a total of 240 votes, 121 of you voted that we shouldn't make a megathread for the recent CSAM developments. We won't.

    Posted: 07 Aug 2021 01:13 PM PDT

    FINAL RESULTS

    I understand this may be controversial considering the "no megathread" team won by 2 votes. As such, we will keep monitoring the discussions and gauging the community as things keep developing.

    As we have told some of you yesterday and today, we will make sure to avoid duplicate posts on the subreddit to make sure discussions are in as few threads as possible.

    Feel free to reach out to us via modmail to express your opinions. We don't bite.

    __

    I really, really thought this poll was interesting due to how close it was. Neither team took a lead higher than 5. I took a few screenshots showing how close this whole thing was (kinda wish I took more)

    At 2:52 EDT (about 24 minutes after poll was posted), votes were split 50/50

    At 3:23 EDT, the "yes megathread" team had the lead with 5 votes.

    At 3:34 EDT, the lead was cut down to just three

    At 3:54 EDT, the lead did not change

    At 3:56 EDT, the "no megathread" team took the lead with 4 votes (no actual screenshot, sorry)

    At 4:00 EDT, the lead was cut to just two

    While this vote does not count, as it happened after the cut-off time, as of 4:28 EDT the votes are tied at 125, which just goes to show that we do need to keep an eye on things.

    submitted by /u/exjr_
    [link] [comments]

    Now more than ever we need Jailbreak to become relevant again

    Posted: 07 Aug 2021 11:57 AM PDT

    I haven't jailbroken my main device since the 3GS era, but in light of recent events, now is the time for Jailbreak to shine again. A simple tweak could get rid of Apple's new "surveillance machine". I know the type of security holes needed for untethered jailbreaks are rare nowadays, but I really hope we see a Jailbreak renaissance sooner rather than later.

    submitted by /u/ermonas
    [link] [comments]

    Not an apologist. Is it possible Apple is being forced into this and is also under a gag order?

    Posted: 07 Aug 2021 05:58 PM PDT

    Remember when companies started reporting call history and stuff to the feds? But also had gag orders and were not legally allowed to say anything? Finally it was broken down a little and a company could give a range of requests it has received?

    Apple is not stupid. They knew this would be poorly received. But they didn't hide it (that we have evidence of). Apple is also not defending itself, which is sort of unlike Apple: usually when Apple steps in shit, they respond with a simply worded open letter.

    I would not be surprised if non-Apple companies are already providing this information via other means. Maybe their "iClouds" are already open, or their devices can simply have this sort of software installed at will by "the authorities".

    This is by no means a good thing in any way for those of us without power. And I am not saying you shouldn't react.

    I am trying to get past the first layer of the story and dig deeper.

    submitted by /u/DrFloyd5
    [link] [comments]

    Do you think there’s anything Apple could do to fix what’s happened as of late, or do you think it’s too late and their reputation is now ruined?

    Posted: 07 Aug 2021 04:55 PM PDT

    I'm curious if you think Apple could fix everything that's happened in the past few days. I can understand why people would boycott Apple as a brand and company, but in saying that, are there any better options out there that won't eventually turn to this as well?

    submitted by /u/aflamingflamingo
    [link] [comments]

    How to Install and Dual Boot Linux and macOS

    Posted: 07 Aug 2021 07:32 AM PDT

    Would this CSAM reporting feature even work now that Apple has published the technical details of the system?

    Posted: 07 Aug 2021 07:14 PM PDT

    After spending time thinking about this news, I'm wondering if it will even be effective against criminals in the first place, or if it just opens up a vulnerable vector for authoritarian governments. Here are my thoughts about what criminals could do while continuing to use iPhone with iCloud Photos:

    A third party app with access to your photo library could encrypt and decrypt photos.

    1. Don't have iCloud Photos turned on.
    2. Download Criminal Photos™ on your device.
    3. Set an encryption password in the app, and grant full photos access.
    4. The app encrypts each image in your library. In the Photos app, your photos would be replaced with images of static. In the third party app with your decryption password, you'd see the normal photo.
    5. Turn on iCloud Photos and continue living undetected by the CSAM detection system.

    You couldn't use the Photos app anymore without seeing a bunch of visual noise, but a NeuralHash wouldn't match static to any images in the CSAM database.
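    The evasion argument above rests on one property: any hash computed over encrypted bytes bears no relation to the hash of the original image. Here's a minimal sketch of that property, using a toy XOR stream cipher and SHA-256 purely as stand-ins (a real app would use AES or similar, and NeuralHash is a perceptual hash rather than a cryptographic one, but the mismatch argument is the same):

```python
import hashlib
import os

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher: XOR with a repeating key. Illustrative only;
    # applying it twice with the same key decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def image_hash(data: bytes) -> str:
    # Stand-in for a perceptual hash like NeuralHash (which matches
    # visually similar images); SHA-256 used here only for illustration.
    return hashlib.sha256(data).hexdigest()

photo = b"...raw image bytes..."   # placeholder for real image data
key = os.urandom(16)

ciphertext = xor_encrypt(photo, key)
assert image_hash(ciphertext) != image_hash(photo)  # scan sees only noise
assert xor_encrypt(ciphertext, key) == photo        # app can still decrypt
```

    The ciphertext is what iCloud Photos would sync and what any on-device matcher would hash, which is why (per the scheme above) nothing in the CSAM database would ever match it.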

    As long as the criminal never saves photos directly to the photos app, they'd be invisible to detection. You could also conceivably put individual encryption passwords on photos, share them with all of your criminal friends, and then supply them with the decryption password. They could save the picture directly to Photos, and then would view it in the third party app with the shared password.

    I don't believe this violates any App Store policies at the moment. Yes, Apple could change App Store policies to prevent this, but at the current time, it would be technically allowed.

    If Apple banned it, criminals could still side-load apps with similar functionality. The types of people who would do this are the people who have tons of CSAM. Maybe they even make money from distributing it. The worst of the worst. And at best, you'd only catch the idiots or the new entrants into the world of exploiting children. I don't think this would make a significant impact on reducing the distribution of CSAM.

    Then if Epic ends up convincing a court that there must be alternate app stores, it's just all over. I guess Apple could ban the app binary from running, but then another app with the same features would pop up with a clean binary.

    To me, this seems very possible. It feels like the system could be evaded, and the net impact would be minimal, while also exposing all common users to a surveillance vulnerability.

    submitted by /u/iNeedAnAnonUsername
    [link] [comments]

    ATP on Apple’s security bug bounty system

    Posted: 07 Aug 2021 05:51 PM PDT

    Apple's CSAM scanning is the fatal blow that knocks down Apple's hard-earned privacy and security image.

    Posted: 07 Aug 2021 07:12 PM PDT

    What is privacy? Privacy is the ability of an individual or group to seclude themselves or information about themselves. Scanning my photos invades my privacy, no matter why it is done.

    In 2015, Apple fought to protect privacy by keeping the FBI out of a dead terrorist's phone; now it is compromising your privacy over potential child sexual abuse content. Such a stark contrast leaves us with some questions. What criteria must potential criminal activity meet to warrant a privacy-invading strategy like this CSAM scanning? What about criminal activities such as terrorism, murder, or rape? And how should established criminal activity be handled?

    Apple's next move may be to promote how important and right it is to protect children, and how the CSAM system's mechanism balances privacy. I agree with the goal. But as stated above, by scanning my photos you are invading my privacy, no matter how safe the system is and how little impact it has; then I have no privacy to speak of. There is no reason to get there by trampling on our privacy. By that logic, the government could scan all of our information to catch every potential criminal in order to protect us all.

    It's hard for me to speculate on what Apple is ultimately trying to achieve with CSAM scanning, but I don't think it will come cheap. As soon as the system officially launches, privacy becomes a joke for Apple, and all its previous investment in building a privacy image is wasted. A double cost.

    But it's not all bad for the industry, as long as competitors can deliver Apple a fatal blow with "we do not scan your phone" as a selling point. It may even revive the market for phones sold on security; BlackBerry might be able to use this opportunity to resurrect itself.

    submitted by /u/MiloDimensi
    [link] [comments]

    Apple has updated the wording around Photos on their privacy page.

    Posted: 08 Aug 2021 01:37 AM PDT

    Screenshots here. Taken from apple.com/privacy

    What's been removed:

    Apple devices are designed so those memories don't leave your hands until you share them.

    Some services process photos in the cloud, which gives them access to your photos. But we designed Photos to process your images right there on your Mac, iPhone and iPad.

    I'm curious why they changed the wording; going by their previous explanation, the first part that was removed would still apply.

    submitted by /u/evenifoutside
    [link] [comments]

    iBook Death Screech // When Vintage Apple Hardware Goes Wrong

    Posted: 08 Aug 2021 12:52 AM PDT

    Apple's CSAM changes: personal thoughts

    Posted: 07 Aug 2021 04:53 PM PDT

    Alright, long post - apologies in advance.

    Some FYIs to start with

    1. Apple (the company's servers, or its employees) can never look at the files on your device unless you back them up to iCloud. All device data is secured with Face ID/Touch ID, which is controlled by the Apple T2 chip or the Apple M1 chip. This hardware is completely offline and cannot be accessed by Apple.
    2. The only trustworthy piece I have referred before posting this is https://www.apple.com/child-safety.
    3. Apple already "scans" or analyses all your local files. That's how "Hey Siri, show me photos of the kids at the beach" works.

    ----

    What do the recent changes mean to you?

    From the NCMEC and other child safety organisations, Apple obtains a database of known CSAM hashes (not images), which is stored on the user's device. A matching algorithm helps flag a given photo as potential CSAM content. This algorithm spits out whether there is a match and what the match level is (you can assume a match level of 0 is weak, while a level of 100 is definitely CSAM content).

    1. As an iMessage user. Whenever you send/receive images on iMessage, and the on-device machine learning algorithms flag that it might potentially be unsafe content, iMessage alerts you upfront and lets you decide if you really want to exchange that image. This happens completely on-device, so your images and data are still safe from being exposed to Apple.
    2. As an iCloud Photos user. Before every photo is synced to iCloud, the above matching algorithm is run on the device. The results of the matching algorithm are safely packed into what Apple calls a "cryptographic safety voucher", which is synced to iCloud along with your actual photo. When this voucher reaches iCloud (the server), Apple calculates whether the match level (from the voucher) exceeds a certain threshold. For example, a revealing pic of someone above the legal age will not cross the threshold, but the same of someone who is a pre-teen might. If the threshold is crossed (Apple says this check won't fail more than once in a trillion matches), then (and only then) will a set of Apple employees actually be able to look at the photo. After this human verification, if it's confirmed to be CSAM content, the account is deactivated and the NCMEC is notified. Apple can't see any other content that doesn't cross this threshold, so your early-morning bad-hair pics are safe from ridicule.
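    The voucher-and-threshold flow just described can be sketched roughly as follows. All names, the 0-100 match scale, and the threshold value here are illustrative assumptions drawn from this description, not Apple's actual implementation:

```python
# Hypothetical sketch of the on-device voucher + server-side threshold
# check described above. Names and values are assumptions for illustration.

THRESHOLD = 90  # assumed cutoff on the 0-100 match scale

def build_voucher(photo_id: str, match_level: int) -> dict:
    # On-device: pack the match result into a "safety voucher"
    # that is uploaded alongside the photo.
    return {"photo_id": photo_id, "match_level": match_level}

def server_check(voucher: dict) -> str:
    # Server-side: only vouchers over the threshold are escalated
    # to human review; everything else stays opaque to Apple.
    if voucher["match_level"] >= THRESHOLD:
        return "human_review"
    return "no_action"

print(server_check(build_voucher("IMG_0001", 12)))  # -> no_action
print(server_check(build_voucher("IMG_0002", 97)))  # -> human_review
```

    The key design point is that the match result travels inside the voucher, so nothing below the threshold is ever visible server-side.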

    ----

    Should I be worried?

    Depends.

    With the current material I have reviewed, there might be little to worry about, since this does not give Apple any more access to your online content than it already has. Remember: your photos on Apple devices are still inaccessible to Apple or local authorities.

    Having said that, I think most of the concerns stem from a sense of impending doom ("OK, so what's next?"), which I think is fair. Apple and local authorities still cannot view your photos even if they confiscate your devices. Even if the local authorities appeal to Apple and request to unlock your phone, I think the February 2016 conflict between Apple and the FBI clearly shows Apple's commitment.

    Next, "what comes after CSAM?" is a good point. There are many social evils (though most of them are subjective), but I think we as a society can agree on a handful of them, such as CSAM, animal cruelty, etc. These should be eradicated, and I'm sure everyone is with me on that. But what if governments (both the US and elsewhere) oblige Apple to enforce more of these "scans" that better suit their agendas (like the Chinese government's)? Yes, this is a valid concern, which goes against free speech, and Apple knows very well that there will be backlash if this ever comes to light; even this change has already faced backlash. If Apple bends to governments' requests, we definitely should be worried.

    ----

    What should I do?

    Ground your objections in valid research. I understand this is a very sensational piece of news floating around right now, but before assuming the worst, it won't hurt to consult someone technically adept to help you understand the implications of these changes. If you still feel your objection is founded, nobody would or should judge you.

    Hold companies accountable to their policies and changes. If user privacy is really under threat, I'll join the fight too.

    If I've gone wrong in my understanding above, please correct me. For the tech-savvy folks out there, you can look at the links on Apple's Child Safety website, including the technical summary below. The page also has technical assessments from independent researchers.

    https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

    Yours,

    Just an iPhone user like many here are.

    Happy to help more. DM me if you want to chat.

    submitted by /u/munukutla
    [link] [comments]
