This is my first try at anything open source so any feedback is welcome :)

  • Sem@lemmy.ml

    Is it a wrapper on top of datafusion, but with the ability to define transformations in yaml? I mean, it looks really cool, I’m just trying to understand the use case.

    • kato (OP)

      Basically yes. The use cases I have found so far at work are to build an API around this to dynamically register automatic reports for data analysts, clients and non-devs. In general this also greatly speeds up dev time for any ETL that we need to deploy (I am part of a data engineering team). Another use case I found is that, using the CLI tool, we can create runbooks for our SRE team to run queries for debugging/data validation purposes. I think we’ll find more as we go, but another part of it was to simplify working with datafusion and deltalake, as their APIs expose a lot of lower-level stuff.

  • spacecadet@lemm.ee

    Very similar to something I’ve been working on, but this seems better (I’m still a Rust programming newb). I am essentially making a DBT clone but in Rust, so it doesn’t take 6 hours to do a DBT run, and it focuses more on using Rust to manage and transform the data.

    • kato (OP)

      Oh, no worries, I am quite new to Rust myself, but I am lucky to be able to use it at work and have already got some experience with datafusion and delta-rs :). Accessing PostgreSQL with this is not supported yet, but I am trying to figure out using OpenDAL for that, which should hopefully make it quite easy to implement.

      • spacecadet@lemm.ee

        What was your programming experience before Rust? I’m coming from Python, so a lot of the time I feel like I’m learning from scratch, but in a good way. I’m glad Rust is teaching me how to break bad habits.

        • kato (OP)

          I have a couple of years of experience writing functional Scala as a backend web dev and switched to doing data engineering 2 years ago. Before that, some C/C++ (this is where my Rust interest came from).

          I definitely understand the feeling of learning from scratch; I had the same experience learning functional programming, but having learnt that made learning Rust much easier.

    • kato (OP)

      ETL stands for extract, transform and load, and it is a widely used architecture for data pipelines: you load some data from different sources (like an S3 or GCS bucket), apply some transformation logic to either aggregate the data or do some other transformation like changing the schema, and then output the result as a different data product.
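
      As a rough illustration, a single step of such a pipeline written directly against datafusion (the query engine this library builds on) could look something like this; just a sketch, with made-up paths and column names, and the exact write API varies a bit between datafusion versions:

      ```rust
      // Minimal sketch of a single ETL step with datafusion.
      // The paths and column names here are made up for illustration.
      use datafusion::dataframe::DataFrameWriteOptions;
      use datafusion::error::Result;
      use datafusion::prelude::*;

      #[tokio::main]
      async fn main() -> Result<()> {
          let ctx = SessionContext::new();

          // Extract: register the raw input (an object store like S3/GCS can
          // also be registered with the context and read the same way).
          ctx.register_parquet(
              "events",
              "data/raw/events.parquet",
              ParquetReadOptions::default(),
          )
          .await?;

          // Transform: aggregate with plain SQL.
          let df = ctx
              .sql("SELECT user_id, count(*) AS event_count FROM events GROUP BY user_id")
              .await?;

          // Load: write the result out as a new data product.
          df.write_parquet(
              "data/curated/event_counts/",
              DataFrameWriteOptions::new(),
              None,
          )
          .await?;

          Ok(())
      }
      ```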

      These pipelines are then usually run on a schedule, or triggered, to periodically output data for different time periods; this makes it possible to deal with large sets of data by breaking them down into more manageable pieces for a downstream data science team or a team of data analysts, for example.

      What this library is aiming at is combining the querying capabilities of datafusion, which is a query parser and query engine, with the Delta Lake protocol to provide a pretty capable framework for building these pipelines in a short amount of time. I’ve used both datafusion and delta-rs for some time and I really love these projects, as they enable me to use Rust in my day job as a data engineer, which is usually a Python-dominated field.

      However, they are quite complex, as they cover a wide variety of use cases, and this library tries to reduce the complexity of using them by constraining them to the use case of building simple data pipelines.
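
      For a rough idea of the kind of lower-level wiring it hides, this is more or less what registering and querying a Delta table with datafusion looks like by hand (again just a sketch with a made-up path; it assumes the deltalake crate is built with its datafusion feature and that the crate versions line up):

      ```rust
      // Rough sketch of querying a Delta table through datafusion by hand.
      // Assumes the deltalake crate's "datafusion" feature is enabled; the
      // table path and query are made up for illustration.
      use std::sync::Arc;

      use datafusion::error::Result;
      use datafusion::prelude::*;

      #[tokio::main]
      async fn main() -> Result<()> {
          let ctx = SessionContext::new();

          // With the "datafusion" feature, DeltaTable implements datafusion's
          // TableProvider, so it can be registered like any other table.
          let table = deltalake::open_table("./data/delta/events")
              .await
              .expect("failed to open delta table");
          ctx.register_table("events", Arc::new(table))?;

          let df = ctx
              .sql("SELECT event_type, count(*) AS n FROM events GROUP BY event_type")
              .await?;
          df.show().await?;

          Ok(())
      }
      ```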