I have serious issues with the discussion in the opening segment of Security Now where Steve and Leo discuss the recent Apple backdoor implementation.

Various quotes:

“In fact, I think that the case could be made that it would be irresponsible for Apple not to provide such a back door.”

“…most CEOs who are in the position to understand that with great power comes great responsibility…”

“I believe that they absolutely will protect the privacy of their users to the true and absolute limit of their ability.”

“You always have had a way in, you just didn’t tell anyone.”

-Steve

These quotes speak to the corporate entity of Apple.

They go on to describe supporting this idea because: “What if Dr. Evil had the launch codes…”. They argue the risk is worth it because of the potential for saving the world.

Discussion:

I feel most people assume their devices can be compromised by advanced agencies like the NSA (for Americans), or through law enforcement procedure, like a court order.

I don’t have an iPhone, but I wonder if that is in the terms of service?

  1. Should Apple have to tell you that they built in a backdoor that can open your device no matter what you do, and that by using the device you agree to that? Maybe it’s already there, maybe it truly was a secret; I’m asking more from the standpoint of whether the consumer/user should be told.

  2. Should the vendor be forced by law to reveal this? Arguments for yes obviously revolve around privacy. I guess the argument against is that criminals/bad actors will deliberately not choose the product? Non-American governments already have that policy because, duh, the US government has the power to compel American businesses…

  3. Do you think we should all accept this as part of the social contract of law and order? In order to keep citizens safe, the government must have the legal and technical ability to conduct lawful search and seizure?

  4. Would a backdoor into every technical item be OK under the circumstances that a court order was issued? If no, what about the time that a young kidnapped girl could have been found alive if authorities had been able to unlock an iPhone (or any device)? If yes, what about every [insert your term for a government you think is bad here] government around the world that issues its version of a “legal” process/order to unlock people’s devices for the purpose of “national security”, which, depending on your personal views, may be oppression or human rights violations?

  5. Is there going to be a class action lawsuit coming? Should there be?

I don’t like the current state of this, change my mind.

Transcript of episode: https://twit.tv/posts/transcripts/security-now-956-transcript-

  • Lutra@lemmy.world · 11 months ago
    1. In terms of terms of service - this is not in the terms of service. It’s a secret social contract. What do we know about the lockset on our doors? Not much. What do we know about the company that made it, and its ability to make keys? Not much. There is a trust that the creator will know things that we won’t, and, for everyone’s betterment, they go to the grave with that knowledge.

    Security is always temporary. Security puts an obstacle in the path of the treasure; it doesn’t seal off the treasure. That’s not how the real world can work. Bury it in concrete, seal it in steel: if the owner can get to it, with enough time, the thief can too. Perfect security isn’t real.

    2. Should they be forced - how can you? There are a thousand vulnerabilities in every product; it’s just that we don’t usually care so much. This is the idea behind many open-source projects: we all know. In reality, businesses make and keep secrets.

    3. It already is a social contract. It just seems important because now it concerns something we care about.

    4. This is the struggle of law and order: to create laws that are never self-contradicting, laws that don’t need exceptions. It’s hard math. Each society decides what IT values, and then makes laws around those values. Every fireman has a protected right not simply to break into my home, but to destroy my home, in order to save lives inside it. It happens every day. They don’t come with keys, they come with battering rams and axes.
      Two things are different, though. First, we trust them, based on years and years of faithful service; they are honest. Second, their actions always leave clear evidence that they did something. I wouldn’t come home and wonder if the fire department had been in the house; I would see the broken window and the smashed-in door and know. With the phones, we don’t know if anyone was in, and this is very, very different. There’s nothing that prevents the phone from flashing a bright red warning that it’s been opened from the inside - except if the person disables the alarm :-) but it’s possible.
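    To make that last point concrete, here is a minimal, purely hypothetical sketch in Python of a tamper-evident access log; it is not a description of anything Apple actually ships. Every privileged unlock appends an entry that commits to the previous one, so a later check can tell whether the record was altered or trimmed in the middle, although, as noted above, whoever fully controls the device could still disable the logging itself.

```python
import hashlib
import json
import time

# Hypothetical hash-chained access log: each entry commits to the
# previous entry's hash, so entries cannot be silently altered or
# removed from the middle of the chain without breaking verification.
GENESIS = "0" * 64

def append_entry(log, actor, reason):
    """Record a privileged access (e.g. an unlock) in the log."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    entry = {
        "time": time.time(),
        "actor": actor,        # e.g. "local-user" or "vendor-support"
        "reason": reason,
        "prev": prev_hash,
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return entry

def verify(log):
    """Return True if the chain is intact from the first entry onward."""
    prev = GENESIS
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if body["prev"] != prev:
            return False
        payload = json.dumps(body, sort_keys=True).encode()
        if hashlib.sha256(payload).hexdigest() != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log = []
append_entry(log, "local-user", "normal unlock")
append_entry(log, "vendor-support", "remote diagnostic access")  # the "bright red warning" case
print(verify(log), len(log))  # prints: True 2 - a UI could flag any non-local entries
```

    The point of the sketch is only that an evidence trail is technically cheap to build; whether anyone is obliged to build it, or to show it to the user, is the policy question in this thread.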

    17 years ago Apple stated that they have a ‘kill switch’ for the apps, and this is similar. What do you do if a million phones go wild? If you could have set up a kill switch, would you regret not doing it?

    What does it mean? It means that people who use these things HAVE to put trust in the person who made them. In the same way, I have to trust VW or Ford if I sit in one of their cars. There is no using the thing without putting a tremendous amount of trust in the person who made it.

    • RedFox@infosec.pub (OP) · 11 months ago

      The analogy of the fire department was a good one. Also a very good point about door locks.

      I have similar thoughts about the electronic security alarm in my house when I hear about the rare insider threat: a security vendor employee who broke into someone’s house by turning off the alarm. I still have one, though. Like you said, I just chose to mostly trust them, in the hope that they’ll police themselves internally, out of their own self-interest in keeping a good reputation and making money.

      I do wish legislation could force vendors to be very transparent about their privileged access instead of the consumer or user just assuming it. Like a Surgeon General’s warning: “We made this, so we can unlock it. We are also forced by law to tell you when we’ve done it.” This is of course unrealistic from a national security or investigation standpoint.

      Also good points about trust. We have to trust an enormous number of institutions: banks, public service agencies, etc. Americans are having trust issues with law enforcement as a result of too many incidents of abuse.

      I wonder how many people care more about the secrets in their phone than their bank account?

      I also don’t live in a country where I can reasonably expect the government to spy on me and take my freedom just for criticism, so I guess the stakes aren’t as high. The AirDrop cracking in China comes to mind. Plenty of countries are being accused of using spyware against journalists and opponents.