
New Feeds endpoint returns feeds that By Feed ID endpoint does not recognize #88

filterdo opened this issue Dec 21, 2022 · 3 comments

@filterdo

I am getting data from https://api.podcastindex.org/api/1.0/recent/newfeeds?max=1000&feedid=5902722

Then I try to get the details for those new feeds at https://api.podcastindex.org/api/1.0/podcasts/byfeedid?id=5902723.

5902723, 5902726, 5902971, 5902737, 5902969, 5902962, 5903003, 5903014, 5902994, 5903044, 5903054, 5903063, 5903042 and 5903056 all return:

{
status: 'true',
query: { id: '5902723' },
feed: [],
description: 'No feeds match this id.'
}

Were these created and then deleted? If so (or even if not), how can you tell when feeds have been deleted from the Podcast Index?
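
For reference, here is roughly how I'm making these calls. This is a minimal sketch rather than my exact code; the auth headers follow the usual Podcast Index scheme (X-Auth-Key, X-Auth-Date, Authorization = sha1(key + secret + date)), and the `feeds` / `id` field names are assumed from the standard response shape:

```ts
// Sketch of the two-step lookup described above (not the exact production code).
// Auth scheme and response field names are assumptions; verify against the API docs.
import { createHash } from "node:crypto";

const API = "https://api.podcastindex.org/api/1.0";
const KEY = process.env.PI_API_KEY ?? "";
const SECRET = process.env.PI_API_SECRET ?? "";

// Standard Podcast Index auth: Authorization = sha1(apiKey + apiSecret + unixTime).
function authHeaders(): Record<string, string> {
  const now = Math.floor(Date.now() / 1000).toString();
  return {
    "User-Agent": "newfeeds-check/0.1",
    "X-Auth-Key": KEY,
    "X-Auth-Date": now,
    Authorization: createHash("sha1").update(KEY + SECRET + now).digest("hex"),
  };
}

async function main() {
  // Step 1: fetch recently added feeds.
  const recent = await fetch(`${API}/recent/newfeeds?max=1000`, {
    headers: authHeaders(),
  }).then((r) => r.json());

  // Step 2: look each one up by feed ID; report the ones the index no longer knows.
  for (const feed of recent.feeds ?? []) {
    const byId = await fetch(`${API}/podcasts/byfeedid?id=${feed.id}`, {
      headers: authHeaders(),
    }).then((r) => r.json());
    // An unknown ID comes back as an empty `feed` array with
    // description "No feeds match this id." (as shown above).
    if (Array.isArray(byId.feed) && byId.feed.length === 0) {
      console.log(`not recognized: ${feed.id}`);
    }
  }
}

main().catch(console.error);
```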

@stevencrader
Collaborator

You can see dead feeds (ones that have been removed from the index) at https://api.podcastindex.org/api/1.0/podcasts/dead?pretty or https://public.podcastindex.org/podcastindex_dead_feeds.csv for a CSV.

I don't see 5902723 or 5902726 in the results, though. I didn't check all of them.
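
If it helps, something like this will check a whole batch of IDs against that CSV. A quick sketch; I'm assuming the feed ID is the first column of the CSV, so double-check against the actual file:

```ts
// Quick sketch: check a list of feed IDs against the dead-feeds CSV.
// Assumption: the feed ID is the first comma-separated column of each row.
const DEAD_CSV = "https://public.podcastindex.org/podcastindex_dead_feeds.csv";

async function checkDead(ids: string[]) {
  const csv = await fetch(DEAD_CSV).then((r) => r.text());
  const dead = new Set(
    csv
      .split("\n")
      .filter((line) => line.trim().length > 0)
      .map((line) => line.split(",")[0].trim())
  );
  for (const id of ids) {
    console.log(`${id}: ${dead.has(id) ? "in dead list" : "not in dead list"}`);
  }
}

checkDead(["5902723", "5902726", "5902971", "5902737"]).catch(console.error);
```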

@daveajones Any ideas?

@filterdo
Author

@daveajones

These are all of the IDs that currently return:

{
status: 'true',
query: { id: '5902723' },
feed: [],
description: 'No feeds match this id.'
}

5910737,5904781,5904702,5908701,5908705,5904771,5908706,5904086,5904839,5904917,5904836,5908799,5908708,5908696,5904897,5904918,5908792,5910725,5904929,5904899,5910704,5904770,5910701,5908707,5904904,5910713,5904835,5911464,5910692,5910721,5910720,5911466,5910998,5910993,5910726,5911457,5913872,5910716,5908725,5913831,5908697,5913863,5908703,5904898,5911451,5911469,5911001,5913844,5913869,5911454,5913773,5913751,5910722,5910989,5913690,5911455,5911449,5911360,5906734,5913154,5910984,5910738,5913815,5913807,5919307,5909910,5904775,5914812,5911470,5908698,5904787,5915166,5911458,5906737,5913832,5907048,5913848,5904785,5921067,5906723,5913253,5917748,5910985,5907045,5911472,5911461,5913243,5913235,5904748,5917749,5912884,5906725,5916383,5906728,5916497,5922864,5916487,5922869,5906046,5904705,5906716,5922546,5906747,5922833,5922870,5906051,5922237,5922830,5922214,5919233,5922839,5916374,5916384,5921064,5922918,5922637,5913148,5922630,5922217,5921681,5906705,5923576,5922631,5923604,5919305,5922835,5923587,5915337,5922868,5911460,5922873,5919295,5904779,5917750,5916387,5916391,5922872,5922545,5923595,5913117,5913130,5926064,5922678,5923583,5916388,5922221,5916392,5923584,5922222,5923596,5906717,5923593,5915335,5923575,5929767,5906757,5906729,5923602,5906730,5906743,5906714,5906746,5906715,5913837,5906727,5920093,5926238,5921065,5916371,5906736,5922634,5922627,5906724,5923581,5923588

@daveajones
Contributor

What you are seeing is the normal purging process. We accept feeds as-is initially, and then have processes that come along later to clean up and delete feeds that are bogus, scams, etc.

Looking at the API logs, you are hitting the API hard. You really don’t need that level of immediacy; it’s built for a slower rate of tracking unless you have a use case I’ve never seen before. If you are just tracking new feed data, hitting it every few minutes should be more than sufficient.
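
Something like this cadence is plenty. A rough sketch only; I'm treating the feedid parameter from your URL as a "feeds newer than this ID" cursor and assuming a numeric `id` field in the response, so verify both against the docs:

```ts
// Rough sketch of a slower polling cadence against /recent/newfeeds.
// Assumptions: standard Podcast Index auth headers; `feedid` acts as a
// "newer than this ID" cursor; each returned feed has a numeric `id`.
import { createHash } from "node:crypto";

const API = "https://api.podcastindex.org/api/1.0";
const POLL_INTERVAL_MS = 5 * 60 * 1000; // every few minutes is more than enough

function authHeaders(): Record<string, string> {
  const key = process.env.PI_API_KEY ?? "";
  const secret = process.env.PI_API_SECRET ?? "";
  const now = Math.floor(Date.now() / 1000).toString();
  return {
    "User-Agent": "newfeeds-poller/0.1",
    "X-Auth-Key": key,
    "X-Auth-Date": now,
    Authorization: createHash("sha1").update(key + secret + now).digest("hex"),
  };
}

let lastSeenId = 0; // highest feed ID seen so far, used as the cursor

async function poll() {
  const res = await fetch(`${API}/recent/newfeeds?max=1000&feedid=${lastSeenId}`, {
    headers: authHeaders(),
  }).then((r) => r.json());
  for (const feed of res.feeds ?? []) {
    lastSeenId = Math.max(lastSeenId, feed.id);
    console.log(`new feed: ${feed.id}`);
  }
}

setInterval(() => poll().catch(console.error), POLL_INTERVAL_MS);
```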
