TL;DR: do you know of any general-purpose languages that can also compile a function to some representation of AND/OR gates (or NAND gates, or whatever)?

Edit: actually any algebra/formal-logical system is also fine (not just boolean algebra).

Yes, A LOT of additional info is needed, like how input/output is defined, and I am interested in how those would be specified. I’m not interested in printing an actual circuit, just the boolean-logic level. And I’m mostly asking because I feel like most compilers can’t generate a clean/mathematical representation from their AST. There’s AST to IR, there are hard-coded optimizations on the IR, and then there are hard-coded mappings from the IR to assembly, but at no point (AFAIK) is the code turned into an algebraic/logical system where something like De Morgan’s Law can be applied. And that seems really sad to me.
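
To make that concrete, here’s the kind of source-level rewrite I mean (just a toy illustration, not anything a real compiler promises):

    /* De Morgan's Law: these two are the same boolean term, and a compiler
       with an algebraic IR could rewrite one into the other as a rule */
    int not_both_a(int a, int b) { return !(a && b); }
    int not_both_b(int a, int b) { return !a || !b;  }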

So you could say my real question is: what compilers have a strong logical/algebraic internal representation of their own AST?

Maybe something like Haskell or Prolog does this. The Wolfram Language almost certainly does, but it’s closed source.

  • chaotic_disorganizer@feddit.de · 1 year ago
    You will need a hardware description language (HDL). Verilog and VHDL are two very established ones, but they are tricky. A newcomer is Bluespec, which is now open source, so if you wanna go down that rabbit hole, I’d recommend this one.

    • jeffhykin@lemm.ee (OP) · 1 year ago

      Sorry, I meant a general-purpose language rather than one that is described as a “hardware description language”. I went ahead and edited the post to be clearer about that.

      Thanks for the info though as others might still be interested in it!

      • Tubbles@lemmy.world · 1 year ago
        The lowest-level languages, such as C, compile down to CPU instructions, which are still way above logic gates. The CPU in turn reads the instructions and controls the computer to, in a way, “simulate” what could be described as a boolean expression – at every CPU clock cycle. The next cycle, the permutation of all control signals and computer components will be different. I highly doubt any programming language implementation has an IR that resembles what you are looking for, including Mathematica. The closest you get is probably HDLs, but then you need to do all the mathing yourself.

        • jeffhykin@lemm.ee (OP) · 1 year ago
          And with so much stuff being built on top of C (or at minimum LLVM), I was afraid that would be the case.

          I was kinda hoping there would be some hacky compiler that could take a C function like:

          short doStuff(short a, short b) {
               short c = a * b;
               short d = c + a; 
               return d;
          }
          
          /* A more realistic example would be something like AES/DES/DEA  */
          
          • Create boolean inputs for the bits in a and b, and boolean outputs for the bits of the return value
          • Have mappings/templates for converting multiply and add into boolean logic expressions (see the sketch after this list)
          • Simplify/compress/optimize the expressions
          • Print out some kind of expression or maybe graph representation of the logic gates and connections
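
          For the add case, the kind of template I’m imagining would expand “+” into single-bit AND/OR/XOR operations, something like this hand-written ripple-carry sketch (hypothetical, just to show the level of abstraction I mean):

          /* ripple-carry adder: "+" lowered to pure AND/OR/XOR on single bits */
          unsigned add_as_gates(unsigned a, unsigned b, int width) {
              unsigned sum = 0, carry = 0;
              for (int i = 0; i < width; i++) {
                  unsigned ai = (a >> i) & 1;                    /* bit i of a    */
                  unsigned bi = (b >> i) & 1;                    /* bit i of b    */
                  unsigned s  = ai ^ bi ^ carry;                 /* sum bit       */
                  carry       = (ai & bi) | (carry & (ai ^ bi)); /* carry-out bit */
                  sum |= s << i;
              }
              return sum;
          }
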
          • Tubbles@lemmy.world · 1 year ago
            I can highly recommend you have a look at some HDLs; e.g., Verilog can look roughly like your example and synthesizes down to logic elements.

            • jeffhykin@lemm.ee (OP) · 1 year ago
              Cool, I’ll give them a shot! I’ll admit I haven’t heard great things about typical HDL langs, so I haven’t looked into them much.

          • Tubbles@lemmy.world · 1 year ago
            Another way could be to run that through a compiler with optimizations enabled, and then decompile the resulting binary back to code. But if you want to optimize hot code, mathematical reduction is seldom where the problem lies.

            • jeffhykin@lemm.ee (OP) · 1 year ago
              I don’t know about mathematical reduction not being the bottleneck. If I were hand-optimizing hot code then yeah, cache-hit optimization is huge, but I’m thinking of generic optimizations on hot code that only the compiler ever looks at: algebraic rewrites beyond the out-of-order-execution and SIMD kind of shuffling. For example, I want to be confident that the compiler would transform something like

              for each in range(x): x += x

              into x <<= x (i.e. multiply x by 2^x).

              And based on stuff like this (which is shockingly recent IMO) I don’t think modern compilers can even find that shortcut right now. Which is kinda sad when you think about it.

              If x=65536, any non-algebraic optimization would be vastly inferior. And sure, an experienced dev wouldn’t make this kind of mistake, but I bet half the code running on the average computer wasn’t written by experienced devs. And it’s not like it’s an either-or situation, we can do both optimization steps.
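
              Roughly, in C, the transformation I’d want the compiler to find on its own (toy sketch; the shift is only valid while x is smaller than the type width, since shifting by more is undefined):

              unsigned long long loop_version(unsigned long long x) {
                  unsigned long long n = x;            /* range(x) is fixed up front  */
                  for (unsigned long long i = 0; i < n; i++)
                      x += x;                          /* doubles x, n times in total */
                  return x;
              }

              unsigned long long closed_form(unsigned long long x) {
                  return x << x;                       /* x * 2^x in a single shift   */
              }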