Fix major API design/security flaw: Denial of service on large quantity of users #34

Open
20kdc wants to merge 1 commit from 20kdc/main into main
20kdc commented 2023-07-19 21:04:09 +02:00 (Migrated from github.com)

The API's existence introduces a long-term denial-of-service attack against large classes of users, including but not limited to large classes of marginalized users.

Users who are running outdated devices will, over time, find themselves effectively "removed from the internet" where this API is applied.

We have seen similar effects on a smaller scale with the advent of Intel SGX hardware being required to view EME-protected content on certain media websites. However, those effects are limited to media, and specifically to services willing to operate on that basis.

Such an API becoming available for general use in all contexts guarantees that these effects will be magnified across many types of content.

Marginalized users running outdated hardware will be affected.

Not to mention people who are simply trying to get more performance out of up-to-date but weak hardware, in ways which cause the attestation to fail. (Don't think I don't know where you got the term "Attestation" from. It's obviously going to involve Secure Enclave or equally overbearing system examination, particularly for the anticheat use case.)

Using Linux? No attestation; the kernel can't be trusted, after all.
Using Windows with drivers that don't fit the requirements? No attestation.
No TPM? No attestation.
No Secure Enclave? No attestation.

And so on.
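The failure cases above amount to server-side gating logic. A minimal sketch (all names and the client model are mine, not from the proposal; the proposal leaves the actual conditions to the attester):

```python
# Hypothetical sketch of attestation-gated serving. Each branch below
# corresponds to one of the failure cases listed above; none of these
# names come from the actual specification.

def attestation_passes(client: dict) -> bool:
    """Model of the Attestation Conditions: anything the Attestor
    did not explicitly plan for is assumed to fail."""
    if client.get("os") == "linux":         # kernel "can't be trusted"
        return False
    if not client.get("tpm"):               # no TPM -> no attestation
        return False
    if not client.get("secure_enclave"):    # no enclave -> no attestation
        return False
    if not client.get("approved_drivers"):  # unapproved drivers -> fail
        return False
    return True

def serve(client: dict) -> str:
    # The denial of service: outdated or independent hardware never
    # reaches the content at all.
    return "content" if attestation_passes(client) else "403 Forbidden"
```

Note that the user has no recourse in any failing branch: the verdict is about their hardware and software stack, not their behaviour.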

This is a logical guarantee short of magical divine intervention, which, as I hope you know, won't happen.

The only fix is deletion.

theseer (Migrated from github.com) approved these changes 2023-07-19 21:50:32 +02:00
klrtk (Migrated from github.com) approved these changes 2023-07-19 21:51:01 +02:00
mpldr (Migrated from github.com) requested changes 2023-07-19 21:53:19 +02:00
mpldr (Migrated from github.com) left a comment

I think a more detailed explanation on why this idea is a well-written declaration of war towards the open internet would be useful.

brendanno (Migrated from github.com) approved these changes 2023-07-19 22:23:00 +02:00
Zumorica (Migrated from github.com) approved these changes 2023-07-19 22:37:02 +02:00
BinBashBanana (Migrated from github.com) approved these changes 2023-07-19 22:38:30 +02:00
20kdc commented 2023-07-19 23:05:08 +02:00 (Migrated from github.com)

> I think a more detailed explanation on why this idea is a well-written declaration of war towards the open internet would be useful.

I'm not good at war speeches. Write your own. But I do have this:

Let's define these entities:

  • The Server: simply uses this API on a wider range of content than explicitly mentioned. (I note for the record that I do not say "than intended".)
  • The User: A user doing any action not authorized by The Attestor(s).
  • The Browser: The browser the User has trusted and is running on their hardware.
  • The Attestor(s): These entities are responsible for defining Attestation Conditions and for verifying the results thereof.
  • The Attestation Conditions: These can be modelled as arbitrary kernel-level executable code. For machines that are not explicitly planned for, the Attestation Conditions are assumed to fail. For any layers of emulation (this will be important later), the Attestation Conditions are assumed to be capable of defeating them using Secure Enclaves and other dangerous technologies, and are therefore assumed to fail. The supplying Attestor has arbitrary choice, by nature, over these Conditions. However, the goals of this specification are not possible without absolute conditions that deny emulation.

The reason the Attestation Conditions are defined the way they are is that they are the requirements of the stated goals of the specification (for example, the anticheat part).
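The model above can be encoded as a predicate (my own encoding, not anything from the specification): only machines the Attestor explicitly planned for pass, and any layer of emulation fails by assumption.

```python
# Sketch of the Attestation Conditions model above. The set of planned
# machines and its contents are hypothetical; the point is that
# membership is decided by the Attestor, not the user.

PLANNED_MACHINES = {"vendor-locked-pc", "vendor-locked-phone"}

def attestation_conditions(machine: str, emulated: bool) -> bool:
    if emulated:
        # Emulation is assumed to be detected and defeated
        # (Secure Enclaves etc.), so attestation fails.
        return False
    # Everything not explicitly planned for fails.
    return machine in PLANNED_MACHINES
```

The predicate is total: there is no input for which an unplanned or emulated machine can succeed, which is exactly the "absolute conditions that deny emulation" property.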

The definition of an open web, at a minimum (there are other hypothetical ways a browser could be built, but I would prefer not to make a mistake here), requires that, assuming:

  • A person or group independent of any other body, including software, business relationships, etc.
  • Who has a Turing Machine or a Bounded Storage Machine of sufficient size and power, with an Ethernet connection to the Internet, and I/O devices sufficient for what is being browsed. (I will not use a precise definition here as there is none. Text-to-speech and VR are both within the gamut of possible output devices for different users and use cases. It is enough to say that the output device should be treated as directly controlled by the Turing Machine and does not contain independent processing power. The most obvious loophole if this was not required would be using a laptop with non-independent code as an 'I/O device'.)
  • Cryptographic locks are unbreakable unless they are known to be relatively trivially defeated. (For example, basing a spam filter on having to first defeat an MD5 hash only gets dangerous once it's timed.)

The following must be possible:

  • A complete Browser must be creatable and usable.
  • Assuming the network is so configured, a complete Server must be creatable and usable.

The web should be considered, these days, as a big interpreter, or emulator.
This is important because it makes the definition of independence in regards to software a little weird due to JavaScript, but if you understand it as an emulator, it's simple.

If there are elements which inherently can't be emulated (by, for example, assistive technologies), then the web isn't open: if an element can't be emulated, it can't be substituted by the user, which means the user couldn't independently replicate it, which violates the requirements above.

It is also worth noting that "Servers do not have to use this specification" does not mean anything if Servers do widely use this specification.
Giving someone else the choice to destroy the open web, framed in the cushioning blankets of security, does not absolve one of essentially pushing the button oneself.

The design of this specification is that to gain access to services provided by the Server, the Browser must do one of:

  • Have entered into a direct relationship with an Attestor. This breaks the independence clause. In addition, the Attestor would presumably make this agreement only if the Attestation Conditions were implemented by the Browser.
  • Successfully execute Attestation Conditions specifically supplied by an Attestor. These aren't Browser-provided, and they fail for any machine that wasn't explicitly planned for.
  • Defeat the cryptographic requirements and thus forge an attestation.

In all of these cases, an assumption above is broken. It is therefore clear that this specification violates the definition of an open web.
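The case analysis can be restated as a small table (a sketch; the phrasing of each entry is mine, but each row is one of the three options above):

```python
# The three paths a Browser can take to gain access, each mapped to the
# open-web assumption it breaks. Every path breaks one, so no path is
# compatible with the open-web definition given above.

PATHS_TO_ACCESS = {
    "direct relationship with an Attestor": "independence of the person or group",
    "pass Attestor-supplied Attestation Conditions": "machine must be explicitly planned for",
    "forge the attestation": "cryptographic locks are unbreakable",
}

def broken_assumption(path: str) -> str:
    # Case analysis is exhaustive: there is no fourth path,
    # and no entry maps to "nothing is broken".
    return PATHS_TO_ACCESS[path]
```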

Tau5 (Migrated from github.com) approved these changes 2023-07-20 00:17:43 +02:00
perillamint (Migrated from github.com) approved these changes 2023-07-20 04:12:36 +02:00
stop-treachery (Migrated from github.com) approved these changes 2023-07-20 12:44:24 +02:00
Leo40Git (Migrated from github.com) approved these changes 2023-07-20 14:50:56 +02:00
Leo40Git (Migrated from github.com) left a comment

Can the internet stop self-destructing for TWO MINUTES

cybik (Migrated from github.com) approved these changes 2023-07-20 16:14:01 +02:00
chfour (Migrated from github.com) approved these changes 2023-07-20 19:35:48 +02:00
cybik commented 2023-07-20 22:41:17 +02:00 (Migrated from github.com)

@Leo40Git ostensibly not.

comradef191 (Migrated from github.com) approved these changes 2023-07-21 00:21:45 +02:00
comradef191 (Migrated from github.com) left a comment

Of course, the creators of this are affiliated with Google;
I'm sure you can work out the implications of that yourself.
