
Big MYSQLDUMP containing only one table #24

Open
chinmoybasak opened this issue May 24, 2024 · 7 comments

Comments

@chinmoybasak

I have a 40 GB MYSQLDUMP containing only one table. ALLTABLES does not work. Is there any way to split the data?

@ramonfincken

Does each row have 1 INSERT?

@chinmoybasak
Author

Thank you for the reply. No, I think each row has multiple inserts.

@ramonfincken

If you could make the export with 1 line = 1 insert (it will be bigger, of course),
then you can easily split the file.

@chinmoybasak
Author

I have exported with ALLTABLES; the biggest gz file is 10 GB and the others are tiny. So are you saying it now has 1 line = 1 insert? And how can I split that one? Can you please share?

@kedarvj
Owner

kedarvj commented May 25, 2024

Hi @chinmoybasak,
If you have a single table of 40G, why do you want to split the inserts into multiple files? Note that multi-row inserts are faster than single-row inserts.

Now, if you do want to split it, note that there will be multiple INSERT statements. You can confirm this by checking:

head -1000 BACKUP.sql | grep INSERT | wc -l

If you want to split all the INSERTs into batches of 10 per file, you can use:

awk 'BEGIN {i=1} /^INSERT/ {file="nittywitty_" sprintf("%02d", int((i-1)/10)+1) ".sql"; print >> file; close(file); i++}' BACKUP.sql

You will have a number of files there... Consider testing before copy-pasting the code ;)
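To see what the one-liner does end to end, here is a self-contained sketch that builds a tiny fake dump and runs the same awk split on it (the `nittywitty_` prefix and toy table `t` are just illustrative names):

```shell
# Build a toy dump: one header line plus 25 single-line INSERT statements
{
  echo "-- MySQL dump header"
  for i in $(seq 1 25); do echo "INSERT INTO t VALUES ($i);"; done
} > BACKUP.sql

# Same split as above: every 10 INSERT lines go to a new numbered file
# (nittywitty_01.sql, nittywitty_02.sql, ...); non-INSERT lines are skipped.
awk 'BEGIN {i=1} /^INSERT/ {file="nittywitty_" sprintf("%02d", int((i-1)/10)+1) ".sql"; print >> file; close(file); i++}' BACKUP.sql

# Inspect the result: 10 + 10 + 5 INSERTs across three files
wc -l nittywitty_*.sql
```

With 25 INSERTs this produces three files, the last one holding the 5-line remainder.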

Thanks,
Kedar.

@kedarvj
Owner

kedarvj commented May 26, 2024

@chinmoybasak,
I read your comment re: "testing my codes to work on the database". If you're testing only, you don't need the whole dump! Just dump a few lines and complete your testing.
In your mysqldump you may consider using --where="true limit 100" to dump only 100 records of the full table and then perform your tests!
If you're going to load 40G of data into MySQL, it is going to take time!
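For example, the --where trick looks like this (user, database, and table names are placeholders; this needs a live MySQL server, so adapt it to your setup):

```shell
# "true limit 100" is injected into the WHERE clause of the SELECT that
# mysqldump generates, so only the first 100 rows of the table are dumped.
mysqldump -u user -p mydb mytable --where="true limit 100" > sample.sql
```

The resulting sample.sql loads in seconds and is enough for testing your code against the real schema.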

Thanks
