How to increase the drop in mAP caused by Robust DPatch? #2181
dominic-simon asked this question in Q&A · Unanswered · 1 comment, 1 reply
Hello,
I'm trying to use Robust DPatch to attack images from COCO. I am able to attack images such that many hallucinations appear, but the hallucinations all have very low confidence (< 20%, usually between 5% and 10%). These low-confidence hallucinations do not have much effect on the mean Average Precision (mAP), which is the main metric I am using to measure the success of the attack (presumably because AP ranks detections by confidence, so false positives near the bottom of the ranking change precision very little). Is there any way I can increase the confidence of the hallucinated objects so that the mAP of the image, and of the whole dataset, decreases?
For reference, here is the code I'm using to generate the patches:
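(The sketch below shows the general pattern I follow with ART's RobustDPatch attack and the PyTorchFasterRCNN estimator; the specific hyperparameter values and file names are placeholders rather than my exact settings.)

```python
import numpy as np

from art.attacks.evasion import RobustDPatch
from art.estimators.object_detection import PyTorchFasterRCNN

# Wrap torchvision's Faster R-CNN in ART's object-detection estimator.
# attack_losses follows the ART Faster R-CNN examples.
detector = PyTorchFasterRCNN(
    clip_values=(0, 255),
    attack_losses=(
        "loss_classifier",
        "loss_box_reg",
        "loss_objectness",
        "loss_rpn_box_reg",
    ),
)

# COCO images as float32, shape (N, H, W, 3), values in [0, 255].
# "coco_images.npy" is a placeholder path.
x = np.load("coco_images.npy").astype(np.float32)

attack = RobustDPatch(
    estimator=detector,
    patch_shape=(40, 40, 3),   # placeholder patch size
    patch_location=(50, 50),   # placeholder top-left corner
    learning_rate=1.0,         # placeholder
    max_iter=500,              # placeholder
    sample_size=1,
    batch_size=1,
    targeted=False,            # untargeted attack
    verbose=True,
)

patch = attack.generate(x=x)          # optimize the adversarial patch
x_patched = attack.apply_patch(x=x)   # paste the patch onto the images
```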
I use pycocotools to measure the mAP. I can provide the script where I do that if necessary. Any recommendations are appreciated. Also, if there is a different object detection patch I can try, I am open to that as well.
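(In outline, the evaluation follows the standard pycocotools COCOeval flow, roughly like this; the JSON file names are placeholders.)

```python
from pycocotools.coco import COCO
from pycocotools.cocoeval import COCOeval

# Ground-truth annotations and the detections produced on the patched
# images, both in standard COCO JSON format (file names are placeholders).
coco_gt = COCO("instances_val2017.json")
coco_dt = coco_gt.loadRes("patched_detections.json")

coco_eval = COCOeval(coco_gt, coco_dt, iouType="bbox")
coco_eval.evaluate()
coco_eval.accumulate()
coco_eval.summarize()  # prints AP/AR, including mAP@[0.50:0.95]
```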
Finally, if anyone working on this repo knows: are there any plans to add an object detection patch that hides objects in the image from the detector (a "disappearing" patch)? That would be very useful to have.
Thank you for your consideration.
Reply:
Did you solve this problem? I have the same problem.