• 4 Posts
  • 65 Comments
Joined 2 years ago
Cake day: June 11th, 2023


  • I used VMs some time ago but never looked deeply into the isolation trade-offs of bare metal vs. VMs, so I can't assess this reasonably.
    Docker caught my interest when it first appeared, and after discovering its networking capabilities I never looked back.
    Basically I'm trying to minimize the chance that an attacker who compromises one dockerized service can start interacting with all my devices. And I have lots of devices, because of a fully automated house. ;) My paranoia will ensure the constant growth of privacy and security :)
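    As an illustration of that isolation idea, a minimal sketch using Docker's --internal flag (the network and image names here are made up): containers on an internal network can talk to each other, but the network has no route to the LAN or the internet.

    # create a network with no outbound route (name is made up)
    docker network create --internal iot_isolated

    # attach a service to it; if this container is compromised it can
    # only reach its peers on iot_isolated, not the rest of the house
    docker run -d --name bridge-svc --network iot_isolated my-ha-image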




  • I'm somewhat paranoid, therefore I'm running several isolated servers. And it's still not bulletproof and never will be!

    • Only the isolated server (i.e. the one without internet access) can fetch data from the other servers, never vice versa
    • SSH access is key-based only (see the sshd/Fail2ban sketch below)
    • Firewall drops everything except non-standard ports on dedicated subnets (see the firewall sketch below)
    • Fail2ban bans after 2 failed attempts
    • Minimum password length of 24 characters, 2FA, password rotation every 6 months
    • Guest network for friends, with no access to any internal subnet
    • Reverse proxy (HTTPS, port 443 only)
    • Every service is accessed through a non-privileged user
    • Isolated Docker services/databases on dedicated Docker networks
    • Every drive + system LUKS-encrypted, passphrase only (see the LUKS sketch below)
    • Dedicated server for home automation only
    • Dedicated server for Docker services and reverse proxy only
    • Isolated data/backup server sharing data via NFS to a TV box and audio system that have no network access
    • Offsite data/backup server, hosted by a friend, reached via SSH tunnel
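    For the SSH and Fail2ban items above, a sketch of what the configuration could look like (standard file paths; the bantime value is my assumption):

    # /etc/ssh/sshd_config - key-based access only
    PasswordAuthentication no
    PubkeyAuthentication yes
    PermitRootLogin no

    # /etc/fail2ban/jail.local - ban after 2 failed attempts
    [sshd]
    enabled  = true
    maxretry = 2
    bantime  = 1h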
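    The firewall item could translate to iptables rules roughly like these (the subnet and port numbers are made-up examples):

    # default-deny inbound, allow established traffic back in
    iptables -P INPUT DROP
    iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
    # non-standard SSH port, reachable from one dedicated subnet only
    iptables -A INPUT -s 10.0.10.0/24 -p tcp --dport 2222 -j ACCEPT
    # HTTPS for the reverse proxy
    iptables -A INPUT -p tcp --dport 443 -j ACCEPT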
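    And the LUKS item, per drive (device and mapper names are placeholders):

    # one-time: encrypt the drive and set the passphrase
    cryptsetup luksFormat /dev/sdX
    # on each boot: unlock, then mount the mapped device
    cryptsetup open /dev/sdX backup_crypt
    mount /dev/mapper/backup_crypt /mnt/backup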





  • easeKItMAn to Selfhosted · AdGuard/PiHole Blocklists merge duplicates (edited 1 year ago)

    If I'm understanding you correctly, you could make use of a shell script for this. Use wget to download the lists, combine them into a single large file, and finally create a new file with no duplicates by using awk '!visited[$0]++' (awk keeps a line only the first time it is seen, since the counter for that line is still zero at first sight).

    wget URL1 URL2 URL3
    cat *.txt > all.list        # merge; the .list extension keeps the output out of the *.txt glob
    awk '!visited[$0]++' all.list > no_duplicates.txt
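    To make this repeatable, the same steps could be wrapped in a small script. A minimal sketch, where URL1..URL3 are still placeholders for real blocklist URLs:

    #!/bin/sh
    # merge-blocklists.sh - sketch only; replace URL1..URL3 with real list URLs
    set -e
    mkdir -p lists
    # download every list into its own directory
    wget -q -P lists URL1 URL2 URL3
    # merge all downloaded lists into one file
    cat lists/* > all.list
    # keep only the first occurrence of each line
    awk '!visited[$0]++' all.list > no_duplicates.txt
    echo "$(wc -l < no_duplicates.txt) unique entries written"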