Hello! I’m writing a Qt/C++ program that should write and read data in a file while another process does the same. What file format should I use to prevent conflicts?

  • ENipo@lemmy.world · 21 points · 1 year ago

    No file format will save you by itself, since at the end of the day XML, JSON, etc. are all just plain text.

    One solution would be to use a database (like SQLite, if you are just doing things locally on your computer).
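
    A minimal sketch of the SQLite route with Qt's SQL module (the file name, table, and the 5-second busy timeout are just example choices; the busy timeout makes the driver wait instead of erroring out while the other process holds the write lock):

        // needs QT += sql in the .pro file
        #include <QCoreApplication>
        #include <QDateTime>
        #include <QSqlDatabase>
        #include <QSqlQuery>

        int main(int argc, char **argv)
        {
            QCoreApplication app(argc, argv);

            // Both processes open the same database file.
            QSqlDatabase db = QSqlDatabase::addDatabase("QSQLITE");
            db.setDatabaseName("shared-data.sqlite");
            db.setConnectOptions("QSQLITE_BUSY_TIMEOUT=5000"); // wait for the other writer
            if (!db.open())
                qFatal("could not open database");

            QSqlQuery query;
            query.exec("CREATE TABLE IF NOT EXISTS events (ts TEXT, payload TEXT)");
            query.prepare("INSERT INTO events (ts, payload) VALUES (?, ?)");
            query.addBindValue(QDateTime::currentDateTime().toString(Qt::ISODate));
            query.addBindValue("hello from process A");
            query.exec();
            return 0;
        }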

    Another solution would be to have a single process that is responsible for writing the data, while all other processes send messages to it via a queue.
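
    For the "send messages to it" part, Qt's local sockets (named pipes on Windows, Unix domain sockets elsewhere) are a natural fit. A sketch of the sending side, assuming the writer process listens on a socket named "data-writer" (made-up name) and each message is one newline-terminated line:

        #include <QCoreApplication>
        #include <QLocalSocket>

        int main(int argc, char **argv)
        {
            QCoreApplication app(argc, argv);

            QLocalSocket socket;
            socket.connectToServer("data-writer");   // name chosen by the writer process
            if (!socket.waitForConnected(1000))
                qFatal("writer process is not running");

            // One message; the writer process is the only one that touches the file.
            socket.write("new event from process B\n");
            socket.waitForBytesWritten(1000);
            return 0;
        }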

      • sip · 1 point · 1 year ago

        sounds good, but you need to look into races.

        if program A uses what it read to compute what it writes back, and program B writes in the meanwhile, A's work won't be ok (a lost update).

        transactions solve this.
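
        With SQLite the read-modify-write looks something like this (BEGIN IMMEDIATE takes the write lock up front, so the other process can't write between the read and the update; the table and column names are just for illustration, and the default QSQLITE connection is assumed to be open already):

            #include <QSqlQuery>
            #include <QVariant>

            bool incrementCounter()
            {
                QSqlQuery q;
                // Grab the write lock now, not at the first UPDATE.
                if (!q.exec("BEGIN IMMEDIATE"))
                    return false;              // another writer holds it; retry later

                q.exec("SELECT value FROM counters WHERE name = 'jobs'");
                const int value = q.next() ? q.value(0).toInt() : 0;

                QSqlQuery update;
                update.prepare("UPDATE counters SET value = ? WHERE name = 'jobs'");
                update.addBindValue(value + 1);
                update.exec();

                return update.exec("COMMIT");
            }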

  • stifle867 · 10 points, 1 downvote · 1 year ago

    The format has nothing to do with it. If two processes are writing to the same file at the same time it will be corrupted. You can either figure out a solution with locks (one process requests a lock on the file, the other process has to wait), or look into using SQLite which is most likely a better option if you can.

    There are other solutions, such as writing to temp files. If you post more information it will be easier to give you the right advice. Why does more than one process need to write to the same file at the same time?
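
    For the temp-file route, Qt already ships QSaveFile: it writes to a temporary file next to the target and atomically renames it over the target on commit(), so a reader never sees a half-written file. Rough sketch (the file name is just an example):

        #include <QByteArray>
        #include <QSaveFile>

        // Write the whole document to a temp file, then atomically replace the target.
        bool saveDocument(const QByteArray &contents)
        {
            QSaveFile file("data.json");
            if (!file.open(QIODevice::WriteOnly))
                return false;
            file.write(contents);
            // commit() flushes and renames in one step; on failure the old file is untouched.
            return file.commit();
        }

    Note that this protects readers from partial writes; it doesn't arbitrate between two writers (last rename wins), so you still want a lock or SQLite for that.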

    • tatterdemalion · 1 point · edited · 1 year ago

      > If two processes are writing to the same file at the same time it will be corrupted.

      That’s not true. You just need to make sure the processes are writing to disjoint regions of the file. I wouldn’t recommend it unless you have a good reason though.
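
      Something like this, where each process owns a fixed, pre-agreed slice of the file (the offsets and file name are made up for the example):

          #include <QByteArray>
          #include <QFile>

          // Each process writes only inside its own 4 KiB region of the shared file.
          bool writeRecord(int processIndex, const QByteArray &record)
          {
              const qint64 regionSize = 4096;    // record must fit in one region
              QFile file("shared.bin");
              if (!file.open(QIODevice::ReadWrite))
                  return false;
              file.seek(processIndex * regionSize);
              return file.write(record) == record.size();
          }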

      • stifle867 · 4 points · 1 year ago

        It can still corrupt due to factors like writes not immediately being propagated to disk. In theory it's possible, it's just harder than you'd think.

  • JoeyJoeJoeJr@lemmy.ml · 5 points · 1 year ago

    Can you describe your use case more?

    I don’t think format matters - if you’ve got multiple processes writing simultaneously, you’ll have a potential for corruption. What you want is a lock file. Basically, you just create a separate file called something like myprocess.lock. When a process wants to write to the other file, it checks whether the lock file exists - if yes, wait until it doesn’t; if no, create it. In the lock file, store just the process id of the process that holds the lock, so you can also add logic to check whether that process still exists (if it doesn’t, it probably died - you may have to check the file you’re writing to for corruption and recover). When done writing, delete the lock file to release the lock.

    See https://en.wikipedia.org/wiki/File_locking, and specifically the section on lock files.

    This is a common enough pattern there are probably libraries to handle a lot of the logic for you, though it’s also simple enough to handle yourself.
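
    Since this is Qt: QLockFile does exactly this out of the box, including writing the PID into the lock file and detecting stale locks left behind by crashed processes. Something along these lines (file names and timeouts are just examples):

        #include <QByteArray>
        #include <QFile>
        #include <QLockFile>

        bool appendLine(const QByteArray &line)
        {
            QLockFile lock("data.json.lock");  // lives next to the real file
            lock.setStaleLockTime(30000);      // locks older than 30 s count as stale
            if (!lock.tryLock(5000))           // wait up to 5 s for the other process
                return false;

            QFile file("data.json");
            const bool ok = file.open(QIODevice::Append)
                            && file.write(line) == line.size();
            file.close();

            lock.unlock();                     // also released automatically on destruction
            return ok;
        }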

  • hblaub · 2 points · 1 year ago

    In most cases, an IPC mechanism should be better. You open up named pipes for two-way communication (the message format can be anything, so why not JSONL) and use that. Or shared memory of some other kind. Only one of the processes would actually write the file to disk once in a while, to avoid on-disk corruption and allow the saving program to do checks and corrections.
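
    In Qt that's QLocalServer/QLocalSocket (named pipes on Windows, local sockets on Unix). A rough sketch of the single writer side, assuming a made-up socket name "data-writer" and that every received chunk is a line to append:

        #include <QCoreApplication>
        #include <QFile>
        #include <QLocalServer>
        #include <QLocalSocket>

        int main(int argc, char **argv)
        {
            QCoreApplication app(argc, argv);

            QFile file("data.jsonl");
            file.open(QIODevice::Append);      // only this process ever opens the file

            QLocalServer server;
            QLocalServer::removeServer("data-writer"); // clean up a stale socket after a crash
            server.listen("data-writer");

            QObject::connect(&server, &QLocalServer::newConnection, [&]() {
                QLocalSocket *client = server.nextPendingConnection();
                QObject::connect(client, &QLocalSocket::readyRead, [&, client]() {
                    file.write(client->readAll());
                    file.flush();
                });
                QObject::connect(client, &QLocalSocket::disconnected,
                                 client, &QLocalSocket::deleteLater);
            });

            return app.exec();
        }

    The other processes just connect a QLocalSocket to "data-writer" and write their messages.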

    But if it's really needed, here are some thoughts: Theoretically, you could write JSONL (JSON Lines), because that's simple and human-readable for debugging. If process A wants to write a new line (a new event, for example), it has to check for "\r\n" at the end and write-append it in one go. Process B gets the filesystem event that the file changed and reads it. It would be better, of course, to create a simple 0-byte ".lock" file while the writing is going on and remove it afterwards. To avoid corruption, you could also create checksums and whatever. Depending on your use case, that can work. I mean, I sometimes open one file in two editors, and of course you can lose modifications you made while it reloads the file from the other editor.

  • coltorl · 2 points · edited · 1 year ago

    For a local application, having multiple file-writing processes sounds like a poor design choice (edit: I guess I assumed you would write to a single file). You can have those processes write to a queue instead and have a third process write the queue to a file (or make one of the two processes the designated writer).