More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.

  • wise_pancake@lemmy.ca

    The most surprising thing here is that they got in contact with a human at Google Cloud to resolve the issue.

    • SorteKanin@feddit.dk

      Imagine this happens to some random personal account… It’d probably be gone for good.

      • wise_pancake@lemmy.ca

        There were several months of people complaining that their data was getting deleted, and Google just ignored the whole thing until it blew up on Hacker News.

  • Kid_Thunder@kbin.social

    And the crazy part is that it sounds like Google didn’t have backups of this data after the account was deleted. The only reason they were able to restore the data was because UniSuper had a backup on another provider.

    This should make anyone really think hard about the situation before using Google’s cloud. Sure, it is good practice and frankly refreshing to hear that a company actually backed up away from their primary cloud infrastructure, but I’m surprised Google themselves do not keep backups for a while after an account is deleted.
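
    For a sense of what “backed up away from the primary cloud” can look like in practice, here is a minimal sketch that copies every object in a Google Cloud Storage bucket to an S3 bucket at a second provider. It assumes the google-cloud-storage and boto3 Python SDKs with credentials already configured; the bucket names are placeholders.

    ```python
    import tempfile

    import boto3                      # pip install boto3
    from google.cloud import storage  # pip install google-cloud-storage

    gcs = storage.Client()
    s3 = boto3.client("s3")

    SRC_BUCKET = "prod-data"        # GCS bucket (hypothetical)
    DST_BUCKET = "offsite-backup"   # S3 bucket at the second provider (hypothetical)

    # Stream each object through a temp file: pull from Google Cloud Storage,
    # push to the other provider, so losing the GCP account can't take out both copies.
    for blob in gcs.list_blobs(SRC_BUCKET):
        with tempfile.NamedTemporaryFile() as tmp:
            blob.download_to_filename(tmp.name)
            s3.upload_file(tmp.name, DST_BUCKET, blob.name)
    ```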

    • towerful

      Actually, it highlights the importance of a proper distributed backup strategy and disaster recovery plan.
      The same can probably happen on AWS, Azure, any data center really
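
      On the disaster-recovery side, a backup nobody has ever restored from is not much of a plan, so below is a sketch of a minimal restore drill: pull one object back from the off-site copy and check its integrity. It assumes boto3 and hypothetical bucket/key names.

      ```python
      import hashlib

      import boto3  # pip install boto3

      s3 = boto3.client("s3")

      def restore_and_verify(bucket: str, key: str, expected_sha256: str) -> bool:
          """Download one object from the off-site copy and confirm it matches the expected checksum."""
          body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
          return hashlib.sha256(body).hexdigest() == expected_sha256

      # Example drill against hypothetical names:
      # restore_and_verify("offsite-backup", "members/ledger.db", "ab12...")
      ```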

      • Kid_Thunder@kbin.social

        Actually, it highlights the importance of a proper distributed backup strategy and disaster recovery plan.

        Uh, yeah, that’s why I said

        it is good practice and frankly refreshing to hear that a company actually backed up away from their primary cloud infrastructure

        The same can probably happen on AWS, Azure, any data center really

        Sure, if you colocate in another datacenter that isn’t your own, they aren’t backing your data up without some other agreement and configuration in place. I’m not sure about AWS, but Azure actually has offline, geographically separate backup options.
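
        For reference, a rough sketch of standing up one of those options, a Recovery Services vault (the vault Azure Backup writes into, geo-redundant by default), using the azure-mgmt-recoveryservices Python SDK. The resource names are placeholders, and the exact model and method names vary between SDK versions, so treat this as an outline rather than a recipe.

        ```python
        from azure.identity import DefaultAzureCredential
        from azure.mgmt.recoveryservices import RecoveryServicesClient
        from azure.mgmt.recoveryservices.models import Sku, Vault, VaultProperties

        # Hypothetical subscription and resource names.
        client = RecoveryServicesClient(DefaultAzureCredential(), "<subscription-id>")

        vault = Vault(
            location="australiasoutheast",
            sku=Sku(name="Standard"),
            properties=VaultProperties(),
        )

        # Create the vault; backups sent to it are replicated to Azure's paired
        # region unless the storage redundancy setting is changed.
        client.vaults.begin_create_or_update("backup-rg", "offsite-vault", vault).result()
        ```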

        • flop_leash_973@lemmy.world

          I use AWS to host a fair amount of servers and some microservices, and with them, if you don’t build backups into your architecture design and the live data gets corrupted, etc., you are screwed.

          They give you the tools to build it all, but it is up to you as the sysadmin/engineer/dev to actually use those tools.
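
          As one concrete example of using those tools, the sketch below snapshots an EBS volume with boto3 and copies the snapshot to a second region, so the backup does not sit next to the data it protects. The volume ID and regions are placeholders.

          ```python
          import boto3  # pip install boto3

          ec2 = boto3.client("ec2", region_name="ap-southeast-2")

          # Snapshot the live volume (hypothetical ID) and wait for it to finish.
          snap = ec2.create_snapshot(
              VolumeId="vol-0123456789abcdef0",
              Description="nightly backup",
          )
          ec2.get_waiter("snapshot_completed").wait(SnapshotIds=[snap["SnapshotId"]])

          # Copy it to a geographically separate region so one regional failure
          # can't take out both the live data and the backup.
          dr = boto3.client("ec2", region_name="us-west-2")
          dr.copy_snapshot(
              SourceRegion="ap-southeast-2",
              SourceSnapshotId=snap["SnapshotId"],
              Description="cross-region copy of nightly backup",
          )
          ```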

    • wise_pancake@lemmy.ca

      The IT guy who set up that backup deserves a hell of a bonus.

      A lot of people would have been happy with their multi-region resiliency and stopped there.

      • Kid_Thunder@kbin.social

        Google Cloud definitely backs up data. Specifically, I said

        after an account is deleted.

        The surprise here is that those backups are gone (or unrecoverable) immediately after the account is deleted.

  • Stern@lemmy.world

    You just know the IT guy who restored it was like, “Y’ALL REAL QUIET WITH THAT ‘WHAT DO YOU EVEN DO HERE’ SHIT.”

  • AutoTL;DR@lemmings.world

    This is the best summary I could come up with:


    More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.

    Services began being restored for UniSuper customers on Thursday, more than a week after the system went offline.

    Investment account balances would reflect last week’s figures and UniSuper said those would be updated as quickly as possible.

    In an extraordinary joint statement from Chun and the global CEO for Google Cloud, Thomas Kurian, the pair apologised to members for the outage, and said it had been “extremely frustrating and disappointing”.

    “These backups have minimised data loss, and significantly improved the ability of UniSuper and Google Cloud to complete the restoration,” the pair said.

    “Restoring UniSuper’s Private Cloud instance has called for an incredible amount of focus, effort, and partnership between our teams to enable an extensive recovery of all the core systems.


    The original article contains 412 words, the summary contains 162 words. Saved 61%. I’m a bot and I’m open source!

  • 0x0

    Clouds… clouds everywhere…