Any option to share the final txt file for non-bash-capable users? ...So we can add it to the lists on the config page.
Or, going a step further, add this feature to Pi-hole with an on/off toggle.
Let's combine forces: upload all the individually found lists, then combine and reshare?
Yes... I think this is a nice idea.
I'm unsure how to collect all the different submissions; I do this with 3 friends of mine through rsync.
Anyone have suggestions on how to combine, or at least host, the different files? (I can combine them via scripts.)
Updated every 10 mins... but as I mentioned before, there is currently an issue with the dnsdumpster.com website: it craps out on googlevideo.com, so the files are very small or empty.
We (you :)) could create or change the script so it copies/renames the local files to something with a random name and uploads them every 24 hours..? Combining => pihole -g
Good question where to upload... making something completely open will result in chaos or abuse.
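A rough sketch of how such an upload-and-combine setup could look; everything here (the tag scheme, the server name, the paths) is an assumption for illustration, not an agreed setup:

```shell
#!/bin/sh
# sketch, not a finished tool: paths and the upload target are assumptions
LIST=/var/www/html/youtube-ads-list.txt
UPLOAD_HOST=example.com                  # hypothetical collection server
TAG=$(hostname | md5sum | cut -c1-8)     # pseudo-random tag so uploads don't collide

# copy the local list under a per-host name, then push it every 24h (via cron)
cp "$LIST" "/tmp/youtube-ads-$TAG.txt"
scp "/tmp/youtube-ads-$TAG.txt" "user@$UPLOAD_HOST:/srv/submissions/"

# on the collection server, merging all submissions is a one-liner:
# sort -u /srv/submissions/youtube-ads-*.txt > /var/www/html/combined.txt
```

The `sort -u` merge deduplicates across all submitted files, so subscribers only need the one combined URL in their `pihole -g` lists.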
root@Pi-Hole-Server:/etc/pihole# ./youtube-ads.sh
Unexpected status code from https://dnsdumpster.com/: 500
Traceback (most recent call last):
File "/etc/dnsdumpster/dnsdumpster/ADS_youtube.py", line 10, in <module>
print(res['domain'])
TypeError: list indices must be integers, not str
grep: /var/log/pihole.log.1: No such file or directory
root@Pi-Hole-Server:/etc/pihole#
Edit: I modified the youtube-ads script slightly to also look at /var/log/pihole.log.1 and pihole.log.2.gz (covering two days' worth of YouTube queries). I get a bit more in the resulting file; I'm unsure whether they change these domains daily,
but it seems to be working a little better for me.
That github link is really dated and doesn't seem to be updated anymore.
Since they change those domains often, I don't use it. Just sayin', YMMV.
As for the other entries, I added two lines to my shell script,
after this line...

grep 'r.*\.googlevideo\.com' /var/log/pihole.log | awk '{print $6}' | grep -Ev '^(googlevideo\.com|redirector\.googlevideo\.com)' | sort -nr | uniq >> /var/www/html/youtube-ads-list.txt
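The two added lines themselves aren't shown; going by the earlier note about /var/log/pihole.log.1 and pihole.log.2.gz, they were presumably something like the following (file names and the exact pipeline are my guess, not the author's script):

```shell
# hypothetical: same pipeline run against the first rotated log
grep 'r.*\.googlevideo\.com' /var/log/pihole.log.1 | awk '{print $6}' | \
  grep -Ev '^(googlevideo\.com|redirector\.googlevideo\.com)' | sort -u >> /var/www/html/youtube-ads-list.txt
# hypothetical: zgrep reads the gzipped second rotation without unpacking it
zgrep 'r.*\.googlevideo\.com' /var/log/pihole.log.2.gz | awk '{print $6}' | \
  grep -Ev '^(googlevideo\.com|redirector\.googlevideo\.com)' | sort -u >> /var/www/html/youtube-ads-list.txt
```

Field 6 of a dnsmasq query line (`... dnsmasq[pid]: query[A] host from client`) is the queried host name, which is what the `awk '{print $6}'` picks out.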
Luckily the site has an API which makes downloading a list of DNS records simple, using something like curl. The API is designed for ad-hoc use, not bulk queries, and is limited to 100 (total) requests from a single IP address per day.
A quick word of warning: I'm using this script on a pfSense server that uses pfBlocker for ad-domain blocking. While the script uses standard FreeBSD/Linux commands, the paths to those commands may be different when run on a Pi-hole.
Lines 1 - 3 clear down the text files, as per the previous scripts above
Line 4 downloads the DNS records from api.hackertarget.com in comma-separated format: domain,IP address
Line 5 removes the second entry on each line (the IP address)
Line 6 removes the top entry from the resulting file, which just contains the text "googlevideo.com"
Done!
As I say, it works for me under pfSense with pfBlocker, so it should work under Pi-hole with minor modification.
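The six steps above can be sketched roughly as follows. The working/output paths and the exact API endpoint are my assumptions, not the original script, and the curl call counts against the 100-requests-per-day limit:

```shell
#!/bin/sh
# sketch under assumptions: file paths and the hostsearch endpoint are guesses
CSV=/tmp/youtube-ads.csv
TMP=/tmp/youtube-ads.tmp
OUT=/tmp/youtube-ads.txt

# lines 1-3: clear down the text files
: > "$CSV"; : > "$TMP"; : > "$OUT"

# line 4: fetch domain,IP records in CSV form (rate-limited, so don't run often)
curl -s 'https://api.hackertarget.com/hostsearch/?q=googlevideo.com' > "$CSV"

# line 5: keep only the domain, dropping the IP address after the comma
cut -d',' -f1 "$CSV" > "$TMP"

# line 6: drop the top entry, which just says "googlevideo.com"
tail -n +2 "$TMP" > "$OUT"
```

The resulting file is one domain per line, ready to serve as a block list.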
As long as you've added http://pi.hole/youtube-ads.txt to your block list, then yes, your steps would work. However, I have serious doubts that the domains being provided are actually ad-serving domains; you could therefore very likely be blocking access to legitimate content.
The only way I know to verify an ad-serving content server is to come across it yourself and confirm the host via YouTube's "Stats for nerds" panel.
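Once you've copied a host out of the "Stats for nerds" panel, checking it against the generated list is a one-liner; the host name and list path below are hypothetical examples:

```shell
#!/bin/sh
# hypothetical host taken from YouTube's "Stats for nerds" panel
host="r4---sn-example.googlevideo.com"
# -F fixed string, -x whole line, -q quiet: the exit status says whether it's listed
if grep -Fxq "$host" /var/www/html/youtube-ads-list.txt; then
  echo "$host is already in the block list"
else
  echo "$host is not listed yet"
fi
```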
Friend, that is a URL that goes directly to your Pi-hole. If it doesn't load, that means you don't have the file in your /var/www/html/.. directory.