Code and comments

    • palordrolap@kbin.social · 1 year ago

      Back in the 80s/90s there were keyrings that would play an alarm if they heard a whistle at a particular frequency. You’re basically playing Marco Polo with your keys.

      I assume they lost popularity because the batteries tended to run out at inopportune times. Batteries are better now. Maybe it’s time those things made a comeback.

      • abadbronc@lemmy.world · 1 year ago

        I remember those! I think the comeback version is the Tile or AirTag but I’m too old to hear them beep.

  • bizdelnick@lemmy.ml · 1 year ago

    My 5 cents:

    1. When piping the output of find to xargs, always use find's -print0 option together with xargs' -0 option. This handles file names containing any allowed characters (spaces, newlines, etc.). (However, I prefer -exec.)
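
A minimal sketch of the difference (the demo directory and file names are made up):

```shell
# Files with spaces or newlines in their names break a plain `find | xargs`;
# NUL-separated records (-print0 / -0) survive every legal file name.
mkdir -p demo
touch demo/plain.txt 'demo/with space.txt'
find demo -type f -name '*.txt' -print0 | xargs -0 rm --

# The -exec form preferred above is equally safe, with no pipe at all:
# find demo -type f -name '*.txt' -exec rm -- {} +
```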

    2. sed has an i command to insert a line; it is better to use it than s/^/...\n/. It makes the code more readable (if we can talk about readability of sed code, huh).
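
For example (GNU sed accepts the one-liner form of i shown here; strictly POSIX sed wants `i\` followed by a newline):

```shell
printf 'alpha\nbeta\n' > file.txt
# Insert a header before line 1 with the i command:
sed '1i # generated file' file.txt
# The substitution spelling of the same edit is harder to read:
# sed '1s/^/# generated file\n/' file.txt
```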

    3. If you want to split a delimiter-separated line and print some field, cut is all you need. Keep awk for more complicated tasks.
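
For instance, picking the login shell out of a passwd-style line (the sample line is made up):

```shell
line='alice:x:1000:1000:Alice:/home/alice:/bin/bash'
echo "$line" | cut -d: -f7           # /bin/bash
echo "$line" | awk -F: '{print $7}'  # same result, heavier tool
```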

    • meteokr@community.adiquaints.moe · 1 year ago

      > 3. If you want to split a delimiter-separated line and print some field, cut is all you need. Keep awk for more complicated tasks.

      Depends on the delimiter, too! For anyone else reading this: sed accepts many delimiter characters. sed "s@thing@thing2@g" file.txt is valid. I use this sometimes when parsing or replacing text with lots of slashes (like directory paths) so I can avoid escaping a ton of stuff.
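
A concrete version of that trick with paths (the directories are made up):

```shell
# With @ as the delimiter, the slashes need no escaping:
echo '/usr/local/bin/tool' | sed 's@/usr/local@/opt@'
# The same substitution with the default / delimiter:
echo '/usr/local/bin/tool' | sed 's/\/usr\/local/\/opt/'
# both print /opt/bin/tool
```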

      • bizdelnick@lemmy.ml · 1 year ago

        I know, but that is not the case I was talking about. I meant widely used commands like awk '{print $2}' that can be replaced with cut -f2.

        • meteokr@community.adiquaints.moe · 1 year ago

          I know you know, as you already demonstrated your higher understanding. I just wanted to add a little bonus trick for anyone reading that doesn’t know, and is learning from your examples.

    • OmnislashIsACloudApp@lemmy.world · 1 year ago

      Agree with one and two, and younger me would have agreed with your third point, but I don't think I do anymore.

      Yes, cut is the simpler and mostly sufficient tool for those tasks.

      But it is just so common to need a slight tweak: a substitution, a specific regex match, a weird multi-character delimiter. You can do it all easily in awk instead of piping three extra times to do everything with the simplest tool.
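
Two examples of such tweaks, awkward with cut alone (sample data made up):

```shell
# Multi-character delimiter plus an in-field substitution, in one awk pass:
printf 'one::two::three\n' | awk -F'::' '{gsub(/o/, "0", $2); print $2}'   # tw0
# Print a field only on lines where another field matches a regex:
printf 'warn 17\nerror 42\n' | awk '$1 ~ /^err/ {print $2}'               # 42
```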

  • SpaceNoodle@lemmy.world · 1 year ago

    I’ve only ever found a use for sed once, two decades into my career, and that was to work around a bug caused by misuse of BigInt in some hash calculations in a Java component; awk remains unused. Bash builtins cover almost everything those tools are typically used for.
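
As an illustration of that last point, parameter expansion covers many classic sed/awk one-liners (the path is made up):

```shell
path='/home/user/report.txt'
echo "${path##*/}"        # report.txt            (instead of awk -F/ '{print $NF}')
echo "${path%.txt}.bak"   # /home/user/report.bak (instead of sed 's/\.txt$/.bak/')
```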

    find and grep see heavy daily use.

    • meteokr@community.adiquaints.moe · 1 year ago

      That’s wild to me, as I use sed all the time. Quick and easy changes in configs? Bam, sed. I don't even need to open vi when I can grep for what I need, then swap it with sed. Though I imagine more seasoned vi nerds would be able to do this faster.
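
A sketch of that workflow on a made-up config file (the no-suffix -i is GNU sed):

```shell
printf 'port = 80\nworkers = 4\n' > app.conf
grep -n 'port' app.conf                       # find the line: 1:port = 80
sed -i 's/^port *=.*/port = 8080/' app.conf   # swap it without opening an editor
```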

    • palordrolap@kbin.social · 1 year ago

      If you’re using find all the time, check whether you have (or can install) some variant of locate. It indexes everything* on the system (*this is configurable) and can be queried with partial pathnames, even with regexes, and it’s fast.

      • SpaceNoodle@lemmy.world · 1 year ago

        I use locate when I don't know where the files are. find has finer controls and can differentiate between regular files, links, directories, etc.
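
For example, the type filters in action (the directory layout is made up):

```shell
mkdir -p tree/sub
touch tree/file
ln -sf file tree/link
find tree -type f   # regular files only: tree/file
find tree -type d   # directories: tree, tree/sub
find tree -type l   # symlinks: tree/link
```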

    • bizdelnick@lemmy.ml · 1 year ago

      sed is not for daily interactive use; it is for reusable scripts. For other purposes, interactive editors are more convenient.

  • t_378@lemmy.one · 1 year ago

    What software did you use to put the slide deck together? It seems to work so nicely when placed on a webpage, too…

    • arglebargle@lemm.ee · 1 year ago

      I don’t know what OP used, but it could be any one of the Markdown presentation tools.

      I like reveal.js

      Your presentation can go in git, looks good anywhere, and is easily shared. It's just rendered HTML.

  • aarroyoc@lemuria.es · 1 year ago

    I always found “find” very confusing. Currently I’m using “fd”, which I think has a more sensible UX.

  • Papamousse@beehaw.org · 1 year ago

    Using un*x since the 90s, this is all I know. I like awk, but it can get fucking complicated; I once maintained a 5,000-line script that parsed CSV to generate JavaScript…

    • palordrolap@kbin.social · 1 year ago

      At that point I’d be looking for languages that have libraries that do what I need. Both Python and Perl have online repositories full of pre-written things. Some that can read CSV and others that can spit out JSON. It’s then a matter of bolting things together, which, hopefully, is a few lines of code rather than 5000.

      There are even awk repositories, but I’m not sure there’s a central, official one like PyPI or CPAN.
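
As a sketch of that bolting-together (file name and fields are made up), Python's standard library alone handles the CSV-to-JSON case:

```shell
printf 'name,qty\napple,3\nplum,7\n' > items.csv
python3 -c 'import csv, json; print(json.dumps(list(csv.DictReader(open("items.csv")))))'
# [{"name": "apple", "qty": "3"}, {"name": "plum", "qty": "7"}]
```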

    • Joe Klemmer (OP) · 1 year ago

      Someone used the wrong tool for the job. If an awk script gets more than a few dozen lines, it’s time to use another language/tool to process that data.

  • it_a_me@literature.cafe · 1 year ago

    I’ve gotten tired of weird regex stuff in awk, sed, and grep, so I’ve moved to perl -E for all but the most basic of things.

    • bizdelnick@lemmy.ml · 1 year ago

      In most cases, extended POSIX regexes are enough, and they look the same as perl regexes.

      I also used perl, until I needed to write highly portable scripts that could run on systems without a perl interpreter (e.g. some minimal Linux containers). Simple things are also simple to do with grep/sed/awk; more complex things can be done with awk but require longer code than in perl.

      • SpaceNoodle@lemmy.world · 1 year ago

        I’ve dealt with systems that lack sed and awk. Bash builtins and other standard tools like cut and tr take care of … well, everything.
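
For example, two everyday jobs done with only those standard tools:

```shell
# One PATH entry per line:
printf '%s\n' "$PATH" | tr ':' '\n' | head -n 3
# Case folding without sed or awk:
echo 'Hello World' | tr 'A-Z' 'a-z'   # hello world
```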

        • bizdelnick@lemmy.ml · 1 year ago (edited)

          Systems with bash but without the standard POSIX utils? I know some without bash (FreeBSD by default, busybox-based distros, etc.) but with grep, sed, and awk, but not vice versa.