
Name change from ebay-Kleinanzeigen to Kleinanzeigen #32

Open
daaanieelq opened this issue May 26, 2023 · 9 comments

@daaanieelq

daaanieelq commented May 26, 2023

Hey, first of all, thanks a lot for the script.
Unfortunately, a few error messages now come up as soon as you try to add the links.
As soon as I enter the command "ebAlert start", I get an error message. ( Webpage fetching error for url: https://www.kleinanzeigen.de/s-45257/clubmate/k0l2039r200 )

I really don't know how to work around this. It would be really nice if someone could help me with it.

EDIT:
I just looked at the code again and found this:

response = requests.get(self.link, headers=custom_header)
if response and response.status_code == 200:
    return response.text
else:
    print(f"<< webpage fetching error for url: {self.link}")

If I understand it correctly, the bot simply can't get access to the page?
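For anyone hitting the same error: the snippet above swallows the actual HTTP status, so a first step is to reproduce the fetch in isolation and print what the site really returns. A minimal sketch, assuming only the `requests` package; the header values here are illustrative, not the ones ebAlert actually uses:

```python
from typing import Optional

import requests

# Illustrative browser-like header; ebAlert's real custom_header may differ.
CUSTOM_HEADER = {
    "User-Agent": (
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
        "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0 Safari/537.36"
    ),
    "Accept-Language": "de-DE,de;q=0.9",
}


def fetch_page(url: str, headers: dict = CUSTOM_HEADER) -> Optional[str]:
    """Return the page body on HTTP 200, otherwise print the real status."""
    response = requests.get(url, headers=headers, timeout=10)
    if response.status_code == 200:
        return response.text
    # Unlike the original snippet, surface the status code for debugging:
    # a 403 or 429 here would point at blocking rather than a network problem.
    print(f"<< fetch failed for {url}: HTTP {response.status_code}")
    return None
```

Seeing the concrete status code (e.g. 403 vs. a timeout) tells you whether the site is actively rejecting the request or simply unreachable from the VPS.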

@dandud100
Contributor

@workinghard

@vinc3PO
Owner

vinc3PO commented May 28, 2023

Hi,
I don't know exactly, but I have tried with the link provided and everything seems to be working for me.
It's possible that ebay is starting to implement some sort of filtering, depending on the number of requests that are sent.
#30

@workinghard

workinghard commented May 28, 2023

It's possible that ebay is starting to implement some sort of filtering

This is something they have already been doing for quite some time.
@daaanieelq: Wait until you get a new IP and try it again. Don't request more often than 1-2 links per 3 minutes, otherwise your IP will end up on a block list. No idea for how long.

@daaanieelq
Author

I actually wasn't even able to run it once. I've set it up on a Linux VPS; maybe that's the issue?

@workinghard

I actually wasn't even able to run it once. I've set it up on a Linux VPS; maybe that's the issue?

Means you have a static IP?

@cyberpete2244

I just want to share my experiences.

  1. I have the script running on a VPS and had no issues in 2 months, even through the name change. I did not have to change anything.

  2. I run 50 searches per minute plus 150 every 5 minutes. No problems.

  3. While developing and testing changes to the script I run it locally on my machine. About 2 months ago the IP blocking started, but I found out what they check: it's the headers. I have implemented an API which fetches randomized HTTP headers for the web scraper every couple of searches. No more blocked-IP messages.

Hope that helps.
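I don't know which API cyberpete2244 uses, but the general idea above, swapping in a fresh randomized set of browser-like headers every few requests, can be sketched without any external service. The `HeaderRotator` class and the User-Agent strings below are illustrative assumptions, not the actual implementation from that fork:

```python
import random

# Illustrative pool; a real setup would use a larger, up-to-date list
# or an external header-randomizing API as described above.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/113.0 Safari/537.36",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 "
    "(KHTML, like Gecko) Version/16.4 Safari/605.1.15",
    "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Gecko/20100101 Firefox/113.0",
]


class HeaderRotator:
    """Hand out a header dict, re-randomizing it every `every` requests."""

    def __init__(self, every: int = 5):
        self.every = every
        self.count = 0
        self.current = self._randomize()

    def _randomize(self) -> dict:
        return {
            "User-Agent": random.choice(USER_AGENTS),
            "Accept-Language": "de-DE,de;q=0.9,en;q=0.8",
        }

    def headers(self) -> dict:
        # Pick a new random header set once `every` requests have been served.
        if self.count and self.count % self.every == 0:
            self.current = self._randomize()
        self.count += 1
        return self.current
```

Each search would then call `rotator.headers()` and pass the result as the `headers=` argument to `requests.get`.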

@dandud100
Contributor

3. I have implemented an API which fetches randomized HTTP headers for the web scraper every couple of searches. No more blocked-IP messages.

Maybe you could do a PR so vinc3PO can merge them.

@cyberpete2244

Maybe you could do a PR so vinc3PO can merge them.

I will try to figure out how compatible it is with my overall changes; I have customized my fork of the script quite a bit by now. Maybe I could instead contribute to the master branch; maybe that is what you meant... I am new to GitHub.

@johnomon

Maybe you could do a PR so vinc3PO can merge them.

I will try to figure out how compatible it is with my overall changes; I have customized my fork of the script quite a bit by now. Maybe I could instead contribute to the master branch; maybe that is what you meant... I am new to GitHub.

That would be amazing!
