
Hit by Lloyds, Halifax, TSB and Bank of Scotland glitch? Your rights explained


Comments

  • sandco Posts: 19 Forumite
    dotdash79 wrote: »
    I'm sure a few weeks ago someone on here was comparing aeroplane availability with bank systems, which aren't 24/7. People don't want banks to go offline to do essential maintenance and upgrades, yet they aren't happy when outages happen.

    Planes and life-support machines have periods when they are not in use so they can be maintained, and quite a lot of planes have failed in the air.

    There is never a perfect time to do upgrades. There is, however, a more optimal time to do them: normally the very early hours between Saturday night and Sunday morning.
    What surprises me this time is the lack of backups or redundancy. Their CEO admitted it was an "HP server failing", so where was the backup? I would say that such critical operations need a 2nd or even a 3rd line backup.

    Doesn't it always come down to the rule: if it's built by man...
  • gjchester Posts: 5,741 Forumite
    sandco wrote: »
    There is never a perfect time to do upgrades. There is, however, a more optimal time to do them: normally the very early hours between Saturday night and Sunday morning.
    What surprises me this time is the lack of backups or redundancy. Their CEO admitted it was an "HP server failing", so where was the backup? I would say that such critical operations need a 2nd or even a 3rd line backup.

    Doesn't it always come down to the rule: if it's built by man...



    The outage was three hours, so they must have had a backup to get it back in service in that time. It wasn't an upgrade, it was a failure, but even planned upgrades can fail and cause issues if the systems can't be reverted in time, as was the case with RBS last year.


    There may have been backups available and online. We are all assuming it was the server that handled the transactions, but the failure could have been in a computer that controlled where the data was sent, or in the system that routed ATM and POS data to the transaction servers. In that case the computers that process the transactions were up, but they were getting no data. We just don't know.


    Failures happen. You can plan for them and put in redundancy, but at some point there will be one part that cannot be duplicated for an instant failover. You can often plan for system failures, but it's harder to plan for something caused outside the system, as the possibilities to consider are far greater.


    Should there have been better failover? Possibly, but without knowing what failed it's hard to say what could be done to add resilience (a rough sketch of the failover idea follows after the thread).
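
To make the redundancy and routing points above concrete, here is a minimal, purely illustrative Python sketch. The class names, the ordering of "2nd line" and "3rd line" standbys, and the routing layer are all invented for the example; nothing here claims to reflect how Lloyds Banking Group's real systems are built. It simply shows why a healthy standby can absorb a failed primary server, while a failure in the layer that routes ATM and POS traffic looks like a total outage even when every back-end server is fine.

```python
# Illustrative only: invented names, not any bank's real architecture.

class TransactionServer:
    def __init__(self, name: str):
        self.name = name
        self.healthy = True

    def authorise(self, payment: str) -> str:
        return f"{self.name} authorised {payment}"


class Router:
    """Front layer that sends ATM/POS traffic to the first healthy back end."""

    def __init__(self, servers):
        self.servers = servers   # ordered: primary, then 2nd- and 3rd-line backups
        self.healthy = True

    def dispatch(self, payment: str) -> str:
        if not self.healthy:
            # The back-end servers may be fine, but nothing reaches them.
            raise ConnectionError("routing layer down: payment declined")
        for server in self.servers:
            if server.healthy:
                return server.authorise(payment)
        raise ConnectionError("no healthy transaction server: payment declined")


router = Router([TransactionServer("txn-1"),   # primary
                 TransactionServer("txn-2"),   # 2nd-line backup
                 TransactionServer("txn-3")])  # 3rd-line backup

print(router.dispatch("a £20 card payment"))   # txn-1 authorises it

router.servers[0].healthy = False              # the primary fails...
print(router.dispatch("a £20 card payment"))   # ...and txn-2 quietly takes over

router.healthy = False                         # the routing layer itself fails...
try:
    router.dispatch("a £20 card payment")      # ...now every payment is declined,
except ConnectionError as exc:                 # even though txn-2 and txn-3 are up
    print(exc)
```

As the posts above note, you can duplicate the routing layer too, but at some point there is a component, or something outside the system entirely, that cannot be duplicated for an instant failover.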
This discussion has been closed.