• yojimbo@sopuli.xyz
    1 year ago

    A dirty Linux admin here. Imagine you ssh into an nginx log folder and all you want to know is which IPs have been beating against a certain URL over, let's say, the last seven days, getting 429s, most frequent first. In kiddie script it's something like find -mtime -7 -name "*access*" -exec zgrep $some_damed_url {} \; | grep 429 | awk '{print $3}' | sort | uniq -c | sort -r | less - depends on how your logs look (and I assume you've been rotating them - that's where the zgrep comes from), should be run in tmux, and could (should?) be written better 'n all - but my point is - do that for me in a GUI

    (I’m waiting ⏲)
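
    A minimal sketch of that same pipeline, wrapped in a function with the variable quoted and a numeric sort. The field numbers assume nginx's default "combined" log format (client IP in field 1, HTTP status in field 9) - adjust the awk fields to match your own logs; the paths in the example are assumptions:

    ```shell
    # Sketch of the one-liner above. Assumes nginx "combined" log format:
    # client IP is field 1, HTTP status is field 9 -- adjust for your format.
    # sort -rn replaces sort -r so counts are compared numerically.
    top_429_ips() {
      logdir=$1   # directory holding the (possibly gzip-rotated) access logs
      url=$2      # URL substring to match
      find "$logdir" -mtime -7 -name '*access*' -exec zgrep "$url" {} \; \
        | awk '$9 == 429 { print $1 }' \
        | sort | uniq -c | sort -rn
    }

    # Example (paths are assumptions):
    #   top_429_ips /var/log/nginx /api/login | less
    ```

    Matching on the status field with awk rather than a bare grep 429 avoids false hits on 429 appearing elsewhere in the line (timestamps, byte counts).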

    • immortalgeek@programming.dev
      1 year ago

      As a general rule, I have most of my app and system logs sent to a central log aggregation server. Splunk, Logentries, even CloudWatch can do this now.

      But you are right: if there is an archaic server whose logs you need to analyse, nothing beats find/grep/sed.

    • Hobo@lemmy.world
      1 year ago

      In Splunk this is a pretty straightforward query that can be piped to stats count and sorted. I don’t know if you’d exactly count that as a GUI though.
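
      For instance, a sketch of the equivalent SPL query - the index, sourcetype, and field names here are assumptions and depend entirely on how the nginx logs are onboarded into your Splunk instance:

      ```
      index=web sourcetype=nginx:access status=429 uri="/some/url"
      | stats count by clientip
      | sort - count
      ```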