• corytheboyd@kbin.social · 23 points · 10 months ago

    Strings became ubiquitous for a reason: they map really clearly to the way we think as humans. Most importantly, when you’re debugging, seeing string data is much friendlier than whatever your symbols map to under the hood (usually integers, from enum structures).
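
    As a rough sketch of the difference in a log line (Rust here; the Status values are made up):

        #[allow(dead_code)]
        enum Status { Active = 0, Suspended = 1 }

        fn main() {
            let code = Status::Suspended as i32;
            println!("status={}", code);        // "status=1" -- which one was that again?
            println!("status={}", "suspended"); // the string is readable at a glance
        }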

    No, obviously it’s not the most efficient thing in the world, but it hardly matters, and you’re not getting anyone to stop because you’re “technically right”.

    • Pipoca@lemmy.world · 6 points · edited · 10 months ago

      Symbols display with friendly string-y names in a number of languages. Clojure, for example, has a symbol type.

      And a number of languages display friendly strings for enumy things - Scala, Haskell, and Rust spring to mind.
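
      In Rust, for instance, that friendly display is one derive away (a minimal sketch; Status and its variants are made-up names):

          #[derive(Debug)]
          enum Status {
              Active,
              Suspended,
          }

          fn main() {
              let s = Status::Suspended;
              println!("{:?}", s); // prints "Suspended", not an opaque integer
          }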

      The problem with strings, compared to enums with a nice debugging display, is that the string type is too wide. Strings don’t tell you what values are valid, strings don’t catch typos at compile time, and they’re murder when refactoring.
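
      Concretely (a Rust sketch with a made-up Color type): a string parameter accepts every string in existence, so a typo compiles fine and just silently never matches, while a typo against the enum is a compile error and the match is checked for exhaustiveness:

          #[allow(dead_code)]
          enum Color { Red, Green, Blue }

          fn hex_from_str(color: &str) -> Option<&'static str> {
              match color {
                  "red" => Some("#ff0000"),
                  "gren" => Some("#00ff00"), // typo compiles fine and never matches
                  "blue" => Some("#0000ff"),
                  _ => None, // every other string also type-checks
              }
          }

          fn hex_from_enum(color: Color) -> &'static str {
              match color {
                  Color::Red => "#ff0000",
                  Color::Green => "#00ff00", // Color::Gren would be a compile error
                  Color::Blue => "#0000ff",
                  // no catch-all needed: the compiler knows these are the only values
              }
          }

          fn main() {
              assert_eq!(hex_from_str("green"), None); // the typo bug, only visible at runtime
              assert_eq!(hex_from_enum(Color::Green), "#00ff00");
          }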

      Clojure symbols are good at differentiating between symbolly things and strings, though they don’t catch typos.

      The other problem the article mentions with strings over a proper struct/ADT/class hierarchy is that strings don’t really have any structure to them. Concatenating strings is brittle compared to building up an AST and then rendering it at the end.
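
      As a tiny sketch of that last point (Rust; Expr and render are made-up names): with an AST the structure lives in the data and rendering happens once at the end, whereas concatenation throws the structure away immediately:

          enum Expr {
              Num(i64),
              Add(Box<Expr>, Box<Expr>),
              Mul(Box<Expr>, Box<Expr>),
          }

          fn render(e: &Expr) -> String {
              match e {
                  Expr::Num(n) => n.to_string(),
                  Expr::Add(a, b) => format!("({} + {})", render(a), render(b)),
                  Expr::Mul(a, b) => format!("({} * {})", render(a), render(b)),
              }
          }

          fn main() {
              // (1 + 2) * 3, built as data; parentheses come out right by construction
              let e = Expr::Mul(
                  Box::new(Expr::Add(Box::new(Expr::Num(1)), Box::new(Expr::Num(2)))),
                  Box::new(Expr::Num(3)),
              );
              assert_eq!(render(&e), "((1 + 2) * 3)");

              // naive concatenation happily produces "1 + 2 * 3", which parses differently
              let concatenated = format!("{} + {} * {}", 1, 2, 3);
              println!("{} vs {}", render(&e), concatenated);
          }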

      Edit: autocorrect messed a few things up I didn’t catch.