fully remove mplex #10069

Closed
2 of 3 tasks
Tracked by #10045
Jorropo opened this issue Aug 15, 2023 · 9 comments


Jorropo commented Aug 15, 2023

Done Criteria

  • mplex code and references are removed from Kubo, including any Kubo docs.
  • changelog entry with a clear removal message
  • (TBD) discuss.ipfs.tech post

When this is done, a user will not have the option of enabling mplex in Kubo.

Why Important

TODO: link to the canonical libp2p issue/doc that describes the woes of mplex.

Notes

  1. This finishes what was started in "turn mplex off by default" (#9958), where we disabled mplex by default.
  2. The mplex code that was copied into Kubo for "chore: update boxo, go-libp2p, and internalize mplex" (#10095) should be removed.
  3. We should do a find/replace in the Kubo repo and ipfs/docs.
  4. This is currently targeted for Kubo 0.24.
  5. If you still need mplex, please share your use case here.
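For users who still need mplex until removal lands, the escape hatch from #9958 is the relevant path. As a hedged sketch (the key name and priority value follow Kubo's `Swarm.Transports.Multiplexers` config convention as I understand it; verify against your release's docs/config.md), opting back in looked roughly like:

```shell
# Assumed Kubo config key; check docs/config.md for your release.
# Give mplex a positive priority so it is negotiated again
# (#9958 disabled it by default).
ipfs config --json Swarm.Transports.Multiplexers.Mplex 200

# Inspect the resulting multiplexer configuration.
ipfs config Swarm.Transports.Multiplexers
```

After this issue is done, the key is expected to disappear entirely, so setting it would become a no-op or a config error.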

Jorropo commented Aug 15, 2023

I want this done for Kubo v0.24

@Jorropo Jorropo self-assigned this Aug 15, 2023
@Jorropo Jorropo mentioned this issue Aug 21, 2023

Jorropo commented Aug 21, 2023

go-libp2p v0.30.0 dropped mplex support. I'm surprised to see that mplex support was re-added with little discussion on the original issue: #9958.

Dropping mplex will be a prerequisite for updating go-libp2p, unless Kubo wants to copy-paste the now-deleted mplex package (see libp2p/go-libp2p#2498 for discussion).

@marten-seemann

Given js-ipfs never got yamux and moving to Helia isn't a simple, straightforward process, I want to have a one-release buffer where mplex is disabled by default.
This gives a big enough window for people who still care about js-ipfs ←→ Kubo interop to add yamux to their js-ipfs config and start moving to Helia.


Copy-pasting code is always meh, but if it's just for one release I think it's fine. If someone finds a bug in the mplex implementation, we (Kubo maintainers) won't invest any time fixing it, given we're going to remove it next month.

@marten-seemann

Given js-ipfs never got yamux and moving to Helia isn't a simple, straightforward process, I want to have a one-release buffer where mplex is disabled by default.

Can you explain where the number 1 is coming from? Helia has been out for a long time. Why does one month more or less make a difference now?

@Jorropo
Copy link
Contributor Author

Jorropo commented Aug 21, 2023

Because 1 is the smallest number that is not 0.

@marten-seemann

My question was: Given that Helia has been out for a long time, why is 0 not a valid number?


Jorropo commented Aug 21, 2023

@marten-seemann because js-ipfs is still widely deployed.
Data from the bootstrappers:

> jq ".aggregations.group_by_event_type.buckets[] | select(.key | startswith(\"js-ipfs\"))" < unique-bootstrap-agents-20230823.json
{
  "key": "js-ipfs/0.17.0",
  "doc_count": 1583455,
  "unique_count": {
    "value": 269
  }
}
{
  "key": "js-ipfs/0.18.0",
  "doc_count": 1039349,
  "unique_count": {
    "value": 4312
  }
}
{
  "key": "js-ipfs/",
  "doc_count": 730116,
  "unique_count": {
    "value": 803
  }
}
{
  "key": "js-ipfs/0.16.1",
  "doc_count": 130080,
  "unique_count": {
    "value": 12
  }
}
{
  "key": "js-ipfs/0.15.4",
  "doc_count": 31252,
  "unique_count": {
    "value": 167
  }
}
{
  "key": "js-ipfs/0.11.1",
  "doc_count": 14905,
  "unique_count": {
    "value": 3
  }
}
{
  "key": "js-ipfs/0.18.1",
  "doc_count": 14869,
  "unique_count": {
    "value": 291
  }
}
{
  "key": "js-ipfs/0.9.1",
  "doc_count": 12316,
  "unique_count": {
    "value": 2
  }
}
{
  "key": "js-ipfs/0.8.0",
  "doc_count": 10867,
  "unique_count": {
    "value": 3
  }
}
{
  "key": "js-ipfs/0.10.8",
  "doc_count": 7189,
  "unique_count": {
    "value": 6
  }
}
{
  "key": "js-ipfs/0.10.6",
  "doc_count": 2507,
  "unique_count": {
    "value": 1
  }
}
{
  "key": "js-ipfs/0.14.1",
  "doc_count": 2120,
  "unique_count": {
    "value": 2
  }
}
{
  "key": "js-ipfs/0.13.0",
  "doc_count": 2076,
  "unique_count": {
    "value": 3
  }
}
{
  "key": "js-ipfs/0.12.2",
  "doc_count": 1304,
  "unique_count": {
    "value": 4
  }
}
{
  "key": "js-ipfs/0.14.3",
  "doc_count": 949,
  "unique_count": {
    "value": 2
  }
}

https://filecoinproject.slack.com/archives/C03KF8VCHV4/p1692608997316389?thread_ts=1692594783.607439&cid=C03KF8VCHV4

AFAIK, doc_count is the total number of captures, while unique_count is deduplicated per peer ID.


In other words, over the last 14 days our bootstrappers have seen at least 5880 js-ipfs peer IDs:

> jq "[.aggregations.group_by_event_type.buckets[] | select(.key | startswith(\"js-ipfs\")) | .unique_count.value] | add" < unique-bootstrap-agents-20230823.json
5880
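For readers without jq handy, the aggregation both one-liners perform can be sketched in Python over a small inline sample shaped like the bootstrapper JSON quoted above (field names follow that JSON; the underlying data file is not public, so the sample values here are just two buckets copied from the thread plus a hypothetical non-js-ipfs bucket):

```python
# Replicates: select buckets whose key starts with "js-ipfs",
# then sum their unique_count.value fields.

sample = {
    "aggregations": {
        "group_by_event_type": {
            "buckets": [
                {"key": "js-ipfs/0.17.0", "doc_count": 1583455, "unique_count": {"value": 269}},
                {"key": "js-ipfs/0.18.0", "doc_count": 1039349, "unique_count": {"value": 4312}},
                # A non-js-ipfs bucket (hypothetical) that the filter must skip.
                {"key": "kubo/0.22.0", "doc_count": 999, "unique_count": {"value": 42}},
            ]
        }
    }
}

def js_ipfs_unique_peers(data):
    """Sum unique_count.value over buckets whose key starts with 'js-ipfs'."""
    buckets = data["aggregations"]["group_by_event_type"]["buckets"]
    return sum(
        b["unique_count"]["value"]
        for b in buckets
        if b["key"].startswith("js-ipfs")
    )

print(js_ipfs_unique_peers(sample))  # 269 + 4312 = 4581
```

Run against the full 15-bucket dump above, the same sum yields the 5880 figure.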

@marten-seemann

If that's the case, what makes you confident that this problem will be resolved within one release cycle then?

We could have kept mplex around (with a deprecation warning) in go-libp2p for longer, had this been surfaced in #9958 and libp2p/specs#553. The situation we're in right now is less than ideal.


Jorropo commented Aug 21, 2023

One release cycle where users can opt back in won't make it perfect, but it's better than zero. I don't know how difficult it is to add yamux to js-ipfs; I guess it's about a day of work at most. Having one escape-hatch release lets consumers decide when in the month they want to do that.
I wasn't aware of libp2p/specs#553. When #10070 was merged and #9958 (comment) was posted, libp2p/go-libp2p#2498 was not on v0.30.0's or v0.31.0's planned changelogs, so I wasn't aware you were about to remove it.

Ideally Kubo would have turned off mplex a while ago but sadly my time machine is out of fuel right now.

We're going to have our triage in a few hours and will make a decision then, but copying LGTM.

@BigLep BigLep mentioned this issue Aug 22, 2023

BigLep commented Aug 22, 2023

To capture where we have ended up: I have updated the issue description with clear done criteria and some other notes.

@BigLep BigLep mentioned this issue Nov 9, 2023
@Jorropo Jorropo moved this from 🥞 Todo to 🏃‍♀️ In Progress in IPFS Shipyard Team Nov 18, 2023
Jorropo pushed a commit that referenced this issue Nov 18, 2023
Jorropo added a commit that referenced this issue Nov 18, 2023
Jorropo added a commit that referenced this issue Nov 18, 2023
Jorropo added a commit that referenced this issue Nov 22, 2023
Jorropo added a commit that referenced this issue Nov 22, 2023
Jorropo added a commit that referenced this issue Nov 22, 2023
@github-project-automation github-project-automation bot moved this from 🏃‍♀️ In Progress to 🎉 Done in IPFS Shipyard Team Nov 22, 2023
@lidel lidel mentioned this issue Nov 24, 2023