Support for jcs-based proofs according to specification #19
@rflechtner hmm, yes, you're right. I'm just commenting to confirm that we need to address this issue, as it makes it difficult to use JCS-based cryptosuites together with the RDFC-based cryptosuites (which was a design goal).

We could give advice like: "If you're going to use JCS-based and RDFC-based cryptosuites on the same payload, do your signatures with the RDFC-based cryptosuites first (which would do any context injection necessary) and then the JCS-based cryptosuites (which would just sign over the injected contexts)"... but that feels like brittle advice. Embedding the context values in JCS proofs feels brittle as well... it's special processing for a cryptosuite used by people that tend to not want to deal with JSON-LD processing.

We'll have to think about what the right path forward here is... it's not immediately obvious given the various other forms of cryptosuites that are in the works (like the one for Data Integrity-based CL Signatures).
@msporny I agree we should aim for a robust solution rather than patch the problem with brittle advice, even if that takes a bit more investment; if we spare ourselves that effort now, it will most certainly come back to haunt us further down the line.

I do think we should give this option some thought: making context injection an internal feature of linked data proofs, such that documents are no longer edited in any way during signing. I get that some do not like the idea of having a separate ...

What's your opinion on that direction? Does this seem feasible?
Hi @msporny, any updates on this? Also wondering if this thread should move to a different forum (e.g., https://github.com/w3c/vc-data-integrity/issues) and if the topic warrants discussion in the relevant DIF WG?
Yes, I discussed this with @dlongley briefly, and on a couple of internal calls. There was a solution proposed that seems like it would work, but what it was slips my mind.
Which DIF WG were you thinking of? I didn't think that DIF worked on Data Integrity-related items?
Yes, let's move the discussion to the vc-data-integrity repository... I want to get the others implementing the JCS version of Data Integrity involved. Whatever we do will need buy-in from all of the implementers.

Apologies for the delay in response... we are heads-down right now in the W3C VCWG trying to get all the Data Integrity specifications and cryptosuites across the global standards line, and so much of our focus is there. There are only around 6 months left and still lots of work to do to ensure there are enough implementations of the specification. I will also note that the JCS implementations are dragging, and we'll have to remove those features from the specifications if we don't have more demonstration of implementations (and test suites).

All that said, yes, please raise an issue so we can officially bring it to the VCWG's attention.
Yup, sorry, that was a brain-glitch, don't think they do. I was thinking of the W3C VCWG.
We've been working on an implementation of a
Will do, and I'll ping Dave in the hopes that he remembers details about the solution you came up with. Thanks!
I believe the solution was, for just the JCS cryptosuite proofs, to include the context from the document (and any JCS cryptosuite context, if necessary) in the JCS cryptosuite proof object itself. This would avoid disturbing other cryptosuites that do not have this issue and that rely on using the top-level @context.
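For illustration, a sketch of what that could look like (not normative; the verification method and proof value below are placeholders): the document's @context is copied into the proof object, so the JCS proof signs over it without the top-level @context ever being modified.

```json
{
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    "https://w3id.org/security/data-integrity/v2"
  ],
  "content": "test",
  "proof": {
    "@context": [
      "https://www.w3.org/ns/activitystreams",
      "https://w3id.org/security/data-integrity/v2"
    ],
    "type": "DataIntegrityProof",
    "cryptosuite": "eddsa-jcs-2022",
    "verificationMethod": "did:example:123#key-1",
    "proofPurpose": "assertionMethod",
    "proofValue": "z..."
  }
}
```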
Yes, it would help a bunch. Thank you! We plan on also doing an implementation, as it's fairly trivial given that we have the other cryptosuites already implemented and we're just talking about swapping out the canonicalization algorithm.

We need a minimum of two independent implementations (yours and ours would meet that bar)... there are two other implementers that we know of that have promised implementations as well. One caveat is that you'll have to set up a server (or Docker image) that exposes an API to "issue" and "verify" so we can run the W3C VCWG tests against your software: https://github.com/w3c/vc-di-ecdsa-test-suite#implementation

Do you think you'd be able to do that in the next 3-5 months?
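As a rough sketch of how small such an endpoint pair can be (assuming an Express server; signCredential and verifyCredential are placeholders for your own eddsa-jcs-2022 code, and the exact request and response shapes are defined by the test suite, not by this sketch):

```javascript
// Sketch of "issue"/"verify" HTTP endpoints for test-suite integration.
// signCredential/verifyCredential are stand-ins for your own implementation.
import express from 'express';

async function signCredential(credential) { /* your signer */ return credential; }
async function verifyCredential(vc) { /* your verifier */ return {verified: false}; }

const app = express();
app.use(express.json());

app.post('/issue', async (req, res) => {
  const verifiableCredential = await signCredential(req.body.credential);
  res.json({verifiableCredential});
});

app.post('/verify', async (req, res) => {
  const result = await verifyCredential(req.body.verifiableCredential);
  res.status(result.verified ? 200 : 400).json(result);
});

app.listen(8080);
```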
Thanks for clarifying! This seems like a reasonable compromise so far. I'll have to take some time to wrap my head around how this must be done to maintain integrity protection for the context values and hence for the semantics of the secured document, but I'll open an issue including your suggestion tomorrow. Feel free to correct me where I'm misrepresenting your ideas. I'll also make sure to mention the constraint that we can't break existing RDF-based suites and implementations.
I'm confident that we can do that, yes. Depends a little bit on what the scope is; our cryptosuite implementation also works as a drop-in replacement for other suites used with your library, which, as you mentioned, should be fairly trivial to do. If we should rather provide a solution that works standalone without relying on any of your code, we're pretty close to that too - we've got a working implementation of at least basic signing and verification.
Great!
I believe that as long as the cryptosuite is implemented by you, it should be fine... however,
This option would be better. I know it's a taller ask, so if you're able to do it, wonderful -- that's fully defensible in the WG as a completely independent implementation. If not, there might be some individuals that object because it's not a completely independent implementation (which is debatable, but they'd certainly debate the "it doesn't count unless it's a full top-to-bottom implementation" point). Any contribution along the lines above will help us get over the finish line, so thank you!
@msporny What we would have to align on, though, is how to handle the injection of the document context on the proof (I'm talking about this line: data-integrity/lib/DataIntegrityProof.js, line 371 at 674d5c1), which the eddsa-jcs-2022 spec says should not be done. Do you plan to change the library's approach to creating proof data so that it accommodates cryptosuites that do not do this, or that do it differently? Or should we reimplement createVerifyData in our jcs suite, which from what I can see replaces the default behaviour?
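For discussion, a rough sketch of what a suite-provided createVerifyData for a jcs suite might look like, assuming the library keeps preferring a cryptosuite-supplied createVerifyData over its default; the ({document, proof}) signature and the helper packages are assumptions for illustration, not the library's documented API:

```javascript
// Sketch of a jcs-based createVerifyData: JCS-canonicalize (RFC 8785) the
// document and the proof options, hash both, and concatenate the hashes,
// without injecting the document @context anywhere.
import {createHash} from 'node:crypto';
import canonicalize from 'canonicalize'; // RFC 8785 JSON Canonicalization Scheme

const sha256 = data => createHash('sha256').update(data).digest();

export async function createVerifyData({document, proof}) {
  // Proof options are the proof without its proofValue.
  const {proofValue, ...proofOptions} = proof;
  const proofHash = sha256(canonicalize(proofOptions));
  const documentHash = sha256(canonicalize(document));
  // eddsa-jcs-2022 concatenates the proof-options hash and the document hash.
  return Buffer.concat([proofHash, documentHash]);
}
```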
At this point, we should presume the
I expect it's the latter, but don't have time in the next week or two to do a deep dive. I'm requesting an opinion from @dlongley and @davidlehn in this thread, which we might want to document here as well.

I'm heads-down trying to get VC v2.0 to the W3C Candidate Recommendation (CR) phase (we're behind schedule by ~2-3 months there). After that is done, I have to circle back around to get BitStringStatusList to CR, and then I'll circle back to the Data Integrity suites to finish them up so we can go through a 2nd CR phase (we'll be forced to do this due to the issue above and because of requests for changes from Google).

That doesn't mean that implementers (such as you, @dlongley, @davidlehn, @HelgeKrueger, and @silverpill) shouldn't figure out the right way to do this and try it out in your implementations. Please do that so that when I get around to writing the text, there is consensus among the implementers on what should be done here.
@msporny I don't quite understand the problem being discussed here. It seems to occur when proofs based on RDFC and JCS are used together in one document? In our case only JCS is used
As far as I can tell, the proposed solution is injection of ...
Yeah, we can do that to ensure we don't break production deployments. Could you please link to/list the projects that are using it in production -- it helps us make the case about the uptake of this technology outside of W3C Member companies. |
I don't have data on usage in the wild. At least one implementation (Mitra) already produces and distributes signed objects containing ...
Ok, thanks for the update. That's useful information that we can take back to the WG. There are clearly more than two independent implementations at this point, so all we need to do is prove that (via the test suite) and that'll ensure that the feature doesn't get cut during the W3C Candidate Recommendation phase. We will absolutely need your help to prove that, though. If Mitra, Bovine, Vervis, Streams, and the others don't submit endpoints or Docker images proving that they pass the test suites, there are some in the group that will attempt to cut the feature.
Looks like there is some data here? I don't know if that is counting the right thing, though: https://mitra.fediverse.observer/list

One option here is to update Mitra to check signatures both ways... W3C counts "significant deployments" in the "tens of thousands to millions of people affected" range. I can see WG members noting that the deployment isn't significant yet and so breaking changes, especially because it's not a standard yet, are ok. That said, we don't want to cause you pain if we can avoid it. Can you see any other way of updating ...
Perhaps the existing implementations could be updated to allow both older proofs using |
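To illustrate the "check signatures both ways" idea in code, a hypothetical transition helper (verifyCurrent and verifyLegacy are placeholders for the new and old verify-data constructions, not functions from any existing library):

```javascript
// Hypothetical transition helper: accept both older and newer proofs by
// attempting verification under both sets of rules, e.g. with and without
// the proposed proof-level @context handling.
async function verifyWithFallback(document, proof, {verifyCurrent, verifyLegacy}) {
  if (await verifyCurrent(document, proof)) {
    return true;
  }
  return verifyLegacy(document, proof);
}
```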
@msporny If you're referring to the EdDSA Cryptosuite test suite, I think it is unlikely that any of us will make an effort to pass it. The mentioned projects are social web servers; there's no reason for them to implement the VC API. If the WG tries to remove ...
I have no control over existing deployments. A breaking change at this point would lead to a lot of coordination problems, so it should be avoided. Also, injecting ...
I can see that adding a conditional rule to the spec could fix the issue while not breaking existing applications. E.g., we could say that when creating a proof, the document @context SHOULD be added to the proof. However, during issuance, we require updating the document context only if the proof has a ...

A remark on the concern by @silverpill:
The data integrity specifications mandate context injection, so if they are implementing them to the letter, this should already be happening. If not, it can't be guaranteed anyway that other implementations would be able to verify the proofs created. It also sounds more complicated than it is; in the end, it is just operating on a set of strings. Context injection, for example, is adding a fixed string if it's not already in the set, and what we are proposing as a solution is comparing sets of strings and overriding the set if necessary.
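To make the "set of strings" point concrete, a small sketch of the two operations being discussed; this is illustrative only, and the exact rules (ordering, subset vs. prefix, error vs. override) are precisely what still needs to be specified:

```javascript
// @context treated as an ordered set of strings, per the comment above.

// Context injection: add a required context value if it is not already present.
function injectContext(document, requiredContext) {
  const context = [].concat(document['@context'] ?? []);
  if (!context.includes(requiredContext)) {
    context.push(requiredContext);
  }
  return {...document, '@context': context};
}

// Comparing sets of strings: check that every context value the proof was
// created against is still present in the document's @context.
function proofContextCovered(document, proof) {
  const documentContext = new Set([].concat(document['@context'] ?? []));
  return [].concat(proof['@context'] ?? [])
    .every(value => documentContext.has(value));
}
```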
That's really unfortunate, I wish you would have conveyed that before we put in the effort to get it spec'd and adopted by the group. If our primary group of implementers aren't going to make an effort to demonstrate that they're conformant with the specification, the WG will almost certainly remove it from the specification. At this point, we're going to have to depend on a different group of implementers to integrate w/ the test suite and demonstrate conformance (which is a requirement for all global standards produced by the W3C -- we have to demonstrate two independent interoperable implementations).

Integrating w/ the test suite isn't difficult, you just put up an endpoint that can digitally sign a payload and verify a payload. Are you saying that the fediverse community won't even put in that level of effort? I guess the same question goes out to @HelgeKrueger and @rflechtner?

At this point, I believe Digital Bazaar will submit an implementation for testing via the test suite... at a minimum, we need at least one more independent implementation.
@rflechtner Is it against the spec to sign JSON objects without the @context property? In the following example, should ...

```json
{
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    "https://w3id.org/security/data-integrity/v2"
  ],
  "object": {
    "content": "test",
    "proof": {
      "type": "DataIntegrityProof",
      "cryptosuite": "eddsa-jcs-2022",
      ...
    }
  }
}
```
@msporny If it's really that simple, I think somebody could do it. When do you expect the group to reach a decision on ...?

Also, why does the test suite require the VC API? The Data Integrity standard has many use cases beyond Verifiable Credentials; I'm surprised that such a requirement has been introduced.
@msporny Note: I haven't read through the details about what is being modified with
I'm planning to implement the test suite. However, when I first looked at https://github.com/w3c/vc-di-eddsa-test-suite/, I was confused, so it got low priority. The explanation is easier to understand now. Just to clarify what is needed:

```json
{
  "name": "My Company",
  "implementation": "My implementation",
  "issuers": [{
    "id": "",
    "endpoint": "https://test.example/issue",
    "method": "POST",
    "tags": ["eddsa-jcs-2022"]
  }],
  "verifiers": [{
    "id": "",
    "endpoint": "https://test.example/verify",
    "method": "POST",
    "tags": ["eddsa-jcs-2022"]
  }]
}
```
Answering first to @silverpill:
Yes and no. Signer implementations are expected to add the
That's a more subtle issue, because it depends on what the 'secured document' is here. Are you signing over the whole structure, or only over the contents of the object property?

The way I see it, if you are doing the former, this may be a valid implementation, although it would be the first time I see the proof property added to a nested object. I can't tell if the proof being a top-level property is a hard requirement; what is your take on this, @dlongley & @msporny? I can say though that many implementations will assume the proof to be present in the root of the document, so this would hurt interoperability.

The latter implementation, however, would be more problematic, because it would mean that the ...
As discussed, we're happy to provide an integration with the test suite at the very least for our cryptosuite implementation, using ...
@HelgeKrueger wrote:
Yes, that's more or less accurate. You don't need to do the controller document if you use a ...

@BigBlueHat, @JSAssassin, @aljones15 can help you through the details when the time comes to integrate... they're the ones that are putting together the test suite.
@BigBlueHat, @JSAssassin, @aljones15, please note the sub-par experience that @HelgeKrueger had wrt. integrating w/ the test suites. I agree that the documentation on how to do it is still quite difficult for implementers to understand, and it sounds like we've already lost one implementer because we've made it seem more difficult than it is to implement... and given how precious each implementer/implementation is for the standards ratification process, we need to improve things from where they are right now.
@HelgeKrueger wrote:
@rflechtner wrote:
Excellent, then we're back in business! That means there will most likely be 3 independent implementations, and all we need is two.
@rflechtner wrote:
A completely independent implementation would be a "nice to have", not mandatory. The DID lookups you're doing now are not a requirement for the cryptosuite implementation; we have many other implementations that are implementing that portion of the specification. It's the JCS portion that we're lacking implementers on.
We don't want to insist that ...

You could also argue that System A should digitally sign its payload as well, which would include a signature over System B's payload (so you could trust the outer payload AND the inner payload). Talking about that sort of thing should probably be placed in the Security Considerations section, as talking about application-layer details is a bit "out of scope" for the specification... but it's worth mentioning. Would one of you please raise an issue if you think this is worth noting in the specification?
Yes, this is true. The algorithms are written such that if the ...

It is entirely possible that much of the above isn't easy to glean from the current specification, so if we need to add/update spec text, please raise an issue against the specification so we can fix this.
@rflechtner In the example above I'm signing only the contents of the object property.
I would accept this risk. The majority of consumers are expected to be JSON processors, not aware of JSON-LD. In addition to that example, my software needs to support the following use cases:
https://www.w3.org/TR/vc-data-integrity/#context-injection says
but it is not entirely clear when and how to inject it. If context injection is a required part of the proof generation, it probably should be specified as an optional (?) step in https://www.w3.org/TR/vc-di-eddsa/#algorithms
I think the use ...

Assuming ...

My experience from the Fediverse is: resolving controller documents is hard. In the Fediverse, the controller is called an actor, and the public key is in the now deprecated publicKey property.
@HelgeKrueger wrote:
Yes, exactly. That is why we only require
Yes, correct. We're trying to balance at least two things:
Excellent! Thank you for writing that tool. We are attempting to write similar testing tools for the W3C Verifiable Credentials ecosystem, and our hope is that some of these tools will overlap w/ Fediverse needs. As a related aside, this link is 404'ing, and I was hoping to see a list of conforming implementations there: https://funfedi.dev/support_tables/verify/ ...

We define "Controller Documents" in https://www.w3.org/TR/vc-data-integrity/#controller-documents now, and we're thinking of moving the concept of "Actor/Controller Documents" into their own standard at W3C: https://w3c.github.io/vc-controller-document/ (but we may not be able to finish that work off during this round of the official Verifiable Credentials Working Group).
I think we all agree on one thing here: the 'secured document', that is, the structure that is secured by a proof, is the one that has the proof attached:

```json
{
  "@context": [
    "https://w3id.org/security/data-integrity/v2"
  ],
  "content": "test",
  "proof": {
    "type": "DataIntegrityProof",
    "cryptosuite": "eddsa-jcs-2022",
    ...
  }
}
```

In turn, if you sent the document from your example to me, I would be adding the ...
The VC data integrity specifications have been designed with interoperability between JCS- and RDFC-based proof methods in mind; as @dlongley stated initially, this was an explicit design goal. While this mechanism might not be particularly important for your use case, I think it is necessary to make this specification a success, as not having it would introduce nasty gotchas to this method of authenticating third-party-supplied data. And as outlined above, choosing not to implement it would mean that no other implementation would be able to verify the proofs you generated, which defeats the purpose of agreeing on a specification.
The first should be no issue as far as I can see; simply sign the embedded object first, and the top-level structure second. Since we've established that a
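As a sketch of that ordering (addProof is a placeholder for whatever signing routine is used, not a prescribed algorithm):

```javascript
// Sketch of the ordering described above: secure the embedded object first,
// then secure the outer structure, so the outer proof also covers the inner
// proof. addProof(doc, options) is a placeholder for your signing routine.
async function signNested(activity, addProof) {
  const signedObject = await addProof(activity.object, {cryptosuite: 'eddsa-jcs-2022'});
  return addProof({...activity, object: signedObject}, {cryptosuite: 'eddsa-jcs-2022'});
}
```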
I agree that how to handle context injection is underdeveloped and needs to be made much more explicit in both the VC data integrity and the vc-di-eddsa specifications. That is why I opened issue w3c/vc-data-integrity#225, but I can open another one that specifically asks to add language that clarifies when context injection should happen (e.g., do we really perform it on verification, or should we just error if the expected context isn't present?), and open a third on https://github.com/w3c/vc-di-eddsa/ asking to explicitly point out that this step needs to be performed.
Thank you for this explanation, @rflechtner!
Did you mean
Yes, I'd appreciate it if you open these additional issues.
That's one of the things that really should be properly defined in the specifications, especially if we treat adding the context on verification as allowable. Otherwise implementations may do this differently, resulting in interoperability issues. I added this to the necessary clarifications in w3c/vc-data-integrity#231.
PRs w3c/vc-di-eddsa#79, w3c/vc-di-eddsa#80, w3c/vc-di-ecdsa#61, and w3c/vc-di-ecdsa#62 were raised and merged to address this issue (discussion happened in those issues). This DB data-integrity library (really, the cryptosuites) now needs to implement this feature. I expect the libraries will fail the test suite until they properly implement the feature. I'm going to leave this open for now until we have a working set of cryptosuites that test successfully against the W3C test suite.
@HelgeKrueger sorry for the late response here; have you managed to get into the test suite yet? We are planning on updating the Data Integrity tests very soon. Are you part of the test suite office hours? We do have support for implementers. Anyway, thanks for the feedback, and I will try to further digest this thread.
Note: I'm opening an issue here as there seems to be a plan to phase out jsonld-signatures (digitalbazaar/jsonld-signatures#178), but the problem we're facing concerns both libraries.
After having fixed an issue with jcs-based proofs not verifying properly in jsonld-signatures a few months back (digitalbazaar/jsonld-signatures#176), I want to circle back to the issue of adding context definitions to the document and proof before signature generation. My concern is that the current practice of altering @context properties during proof generation and verification may simply not be viable for proof mechanisms that directly sign the JSON representation of linked data.
Specifically, I see two issues:
1. Adding a proof-(suite-)specific context to the document can invalidate pre-existing proofs. This practice is adopted both in this library as well as in jsonld-signatures. Adding, for the purpose of supporting a wider range of verifiers, an Ed25519Signature2020 proof to a document that has already been signed with a DataIntegrityProof using eddsa-jcs-2022 will alter the document, after which the eddsa-jcs-2022 suite will no longer verify the existing proof (a small sketch illustrating this follows below).
2. Adding the document context prior to signing and verification, as done in this library and, inconsistently (only on verification), in jsonld-signatures, seems to be at odds with the prescribed proof configuration of existing jcs-based cryptosuite specifications authored by Digital Bazaar (https://www.w3.org/TR/vc-di-eddsa/#eddsa-jcs-2022 & https://www.w3.org/TR/vc-di-ecdsa/#proof-configuration-ecdsa-jcs-2019). Even though this behaviour can be patched for digitalbazaar/data-integrity by reimplementing createVerifyData in your cryptosuite, jsonld-signatures' behaviour will break verification in this case, which leads to a dilemma where implementing the specs correctly actually harms interoperability between implementations.

I'm hoping to start a thread here to discuss improvements that can be made to the way this is handled.
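To make the first issue concrete, a minimal sketch using the RFC 8785 canonicalize package, with a bare SHA-256 over the canonicalized document standing in for the actual verify-data construction:

```javascript
// Minimal illustration of issue 1: mutating the document's @context after a
// jcs-based proof has been created changes the JCS-canonicalized bytes, so
// the existing proof can no longer verify.
import {createHash} from 'node:crypto';
import canonicalize from 'canonicalize'; // RFC 8785 JSON Canonicalization Scheme

const jcsHash = doc => createHash('sha256').update(canonicalize(doc)).digest('hex');

const signedDoc = {
  '@context': ['https://www.w3.org/ns/activitystreams'],
  content: 'test'
};
const hashAtSigningTime = jcsHash(signedDoc);

// A second suite (e.g. Ed25519Signature2020) injects its context...
signedDoc['@context'].push('https://w3id.org/security/suites/ed25519-2020/v1');

// ...and the bytes the jcs-based proof signed over no longer match.
console.log(jcsHash(signedDoc) === hashAtSigningTime); // false
```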
To kick this off, I can throw in two suggestions: