24 hours later, I'm back!
Summary of the previous episodes:
- 3 days ago, my old dedicated server started spewing error messages about the CPU.
- On a hunch, I ran backup-manager with my existing configuration that's supposed to send the compressed database dump + files to an FTP backup server.
- I contacted my hosting provider to figure out the error.
- Answer was quick: "we can't physically intervene on the server, and the same offer isn't available anymore, you have to pick a new offer".
- I picked an upgrade; no answer for a day.
- A day ago, they made the new server available to me, and immediately removed access to the old server from my account, including the attached FTP backup storage.
- Asked support to restore access to either the old server or at least the FTP storage. At this point I've lost all my Internet history from 2009 onwards.
- The "Customer Excellence Specialist Level 2" threw his hands in the air, saying that the CPU errors made both SSH and KVM access impossible.
- 12 hours ago, they restored my access to the server and the FTP storage. No luck, the last backup there dates back to 2019. The server's IP is "very pingable" according to support, but no remote connection is possible.
- Through the hosting provider's console, I rebooted in rescue mode. No luck, still no remote connection. I switched back to regular mode with another reboot.
- I start to receive emails from my third-party monitoring system saying that my websites are available again!
- I jump on my terminal, the remote connection works! I don't know how much time I have, so I stop the web server to save CPU cycles. I find the fresh local backup files and start downloading them to my machine. 25 GB at 500 kB/s, roughly 14 hours, it's not looking good. I try to access my old server from my new server directly: transfer rate is 50 MB/s, minutes instead of hours, we're getting somewhere.
- The archive is compressed with LZMA, but I'm unable to expand it; lzma reports a succinct error.
- Pushing my luck, I start a full MariaDB database backup on the old server. Meanwhile, I start copying the old server's files onto the new one using scp. Wrong choice: permissions and symbolic links aren't preserved. Started over with rsync with better success (see the sketch after this list).
- Importing the database backup was straightforward, it works right out of the box with the already existing user credentials.
- I finish importing the files, fire up the web server after fixing several unmet dependency issues, and here I am again!
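For the record, here is a minimal sketch of the two transfer attempts; the host alias and paths are hypothetical stand-ins rather than my actual setup:

# scp -r follows symbolic links and doesn't preserve ownership,
# which is what bit me:
scp -rp old-server:/var/www/ /var/www/

# rsync in archive mode (-a) keeps permissions, ownership, symlinks and
# timestamps; -H also preserves hard links, and --numeric-ids avoids
# remapping uid/gid by name on the new machine:
rsync -aH --numeric-ids old-server:/var/www/ /var/www/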
Phew. Did you miss me?
Hypolite Petovan
•@social elephant in the room After 15 years administering a dedicated server, I have yet to find a way to set up a backup strategy that I can then forget. If it requires any manual intervention from me or interrupts the service I host for the slightest time, I will eventually drop it.
Now, even with an all-in-one script like backup-manager, I ended up with a backup compressed in LZMA that I mysteriously couldn't open on a different machine, and it wasn't uploaded to the FTP destination I gave it, probably because it was short on space, but I wasn't informed about the issue.
So I really don't know what to do about it. I've looked into MariaDB incremental backups to reduce the backup process load, but they can't easily be automated, and restoring a full backup requires as many steps as there were incremental backups, which makes it absolutely unworkable.
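To illustrate the step count, here is a rough sketch of the mariabackup incremental flow as I understand it; the directories are hypothetical and connection options are omitted. Every incremental has to be replayed in order before the data can be restored:

# Day 0: full base backup
mariabackup --backup --target-dir=/var/backups/full

# Following days: incrementals, each based on the previous one
mariabackup --backup --target-dir=/var/backups/inc1 --incremental-basedir=/var/backups/full
mariabackup --backup --target-dir=/var/backups/inc2 --incremental-basedir=/var/backups/inc1

# Restore: prepare the base, then replay every incremental in order
mariabackup --prepare --target-dir=/var/backups/full
mariabackup --prepare --target-dir=/var/backups/full --incremental-dir=/var/backups/inc1
mariabackup --prepare --target-dir=/var/backups/full --incremental-dir=/var/backups/inc2
mariabackup --copy-back --target-dir=/var/backups/full

A single full dump restores in one step; N incrementals restore in N+2.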
Hypolite Petovan
•@social elephant in the room Thank you for your recipe, however I personally don't have a separate machine to attempt a restore on. Also, it would require manual intervention on my part, which I would eventually stop doing.
And I would tune out the daily emails real quick. I enjoy Uptime Robot because it only sends an email when the monitored websites are down (or up again).
Backing up files locally is easy enough to automate with rsync, but I still need a practical solution for the database that doesn't involve a full SQL dump, since the performance impact is sizable. I also need a solution that creates a few days' worth of backups and then cleans up after itself. backup-manager is supposed to do so, but the absence of an alert when the process doesn't complete as expected is unnerving.
Шуро
•> I have yet to find a way to set up a backup strategy that I can then forget
I have been doing backups at work as part of my duties for years now, and I am pretty sure no such thing exists. Everything fails and gets stuck, health checks included.
At home I also use backup-manager, but just to generate the backups; my own script then transfers them (and another one sends a notification if the resulting files aren't produced; see the sketch below).
However, broken backups still happen; I was in a similar situation when my machine got bad RAM. Everything seemed to work for a while, and while it produced archives just fine, they were broken.
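The notification part is nothing fancy, roughly this kind of sketch (the directory, file pattern, age threshold and address here are made up):

#!/bin/sh
# Alert if no backup archive was produced in the last 24 hours.
ARCHIVE_DIR=/var/archives
ALERT_TO=admin@example.com

# Look for at least one archive modified less than a day ago.
recent=$(find "$ARCHIVE_DIR" -name '*.tar.*' -mtime -1 | head -n 1)

if [ -z "$recent" ]; then
    echo "No backup archive newer than 24h in $ARCHIVE_DIR" \
        | mail -s "backup check failed on $(hostname)" "$ALERT_TO"
fi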
Russ O ❤ 🇺🇦
•Heroic exploits indeed. Good work on the recovery.
There are two adages that spring to mind:
1) you can never have enough backups
2) if you don't test your backups, they don't work
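Adage 2 can at least be partially automated. A minimal sketch, assuming LZMA-compressed archives in a hypothetical /var/archives, that test-reads each archive without extracting it; a periodic trial restore on a spare machine is still the real test:

# lzma -t decodes each archive in memory and reports corruption
# without writing the decompressed data anywhere.
for archive in /var/archives/*.tar.lzma; do
    if ! lzma -t "$archive" 2>/dev/null; then
        echo "CORRUPT: $archive"
    fi
done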
Hypolite Petovan
•@Александр @social elephant in the room
I am not surprised, but still disappointed. It just doesn't work with my own mindset. I have occasionally tried to set something up, but each time the drawbacks ended up outweighing even the ability to recover from a complete loss.
I've come to live with the risk and accept potential losses in advance. It still doesn't feel great, but it just doesn't work otherwise.
Hypolite Petovan
•rsync worked wonderfully, and the DNS switch to the new server's IP was quick.
Brad Koehn ☑️
•I’m using Ivory as a client.