Best Practice - storing large amounts of data

Quick to answer questions about finding your way around Linux Mint as a new user.
Forum rules
There are no such things as "stupid" questions. However if you think your question is a bit stupid, then this is the right place for you to post it. Stick to easy to-the-point questions that you feel people can answer fast. For long and complicated questions use the other forums in the support section.
Before you post read how to get help. Topics in this forum are automatically closed 6 months after creation.
3pinner
Level 3
Posts: 128
Joined: Mon Mar 05, 2012 8:47 pm

Best Practice - storing large amounts of data

Post by 3pinner »

I'm looking for suggestions from those who have a LOT of data on their computer.
I'm setting up a new system for my wife. She has thousands of photos.
I'm trying to decide whether to have her continue adding photos as part of her /home folder (on a 4 TB drive)
or
to install a separate drive specifically for photos.
It seems backup would be easier if they were part of the /home folder. I could use Timeshift or Lucky Backup for that.

If I have a second drive for photos, I'd probably schedule two backups: one for the system and /home (Timeshift for this one),
and Lucky Backup to a different location for the photos, set up to back up only new & changed files.
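The "only new & changed files" idea can be sketched with plain coreutils; the temp directories and file names below are made-up stand-ins for the real photo paths:

```shell
#!/bin/sh
# Minimal demo (hypothetical paths): temp dirs stand in for the photo
# folder and the backup drive.
SRC="$(mktemp -d)"; DEST="$(mktemp -d)"
echo "holiday"  > "$SRC/img001.jpg"
echo "birthday" > "$SRC/img002.jpg"

# -u copies a file only if it is missing from DEST or newer than the
# copy there, so repeated runs touch only new and changed files.
# -p preserves timestamps so future runs can compare them.
cp -up "$SRC"/*.jpg "$DEST"/
```

Tools like Lucky Backup (or rsync underneath it) do the same comparison far more robustly, with deletions and logging; this only illustrates the principle.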

Or am I making this too complicated??
What do you use for an insane amount of data?
Thanks
Last edited by LockBot on Sun Dec 03, 2023 11:00 pm, edited 1 time in total.
Reason: Topic automatically closed 6 months after creation. New replies are no longer allowed.
mikeflan
Level 17
Posts: 7096
Joined: Sun Apr 26, 2020 9:28 am
Location: Houston, TX

Re: Best Practice - storing large amounts of data

Post by mikeflan »

I say you are making this too complicated. The answer to your question depends on a few things: one is whether you are using desktops, or all your computers are exclusively laptops; another is how much data you have.

I use desktops mostly. I plug in a 14 TB drive - one giant EXT4 partition - and copy my 9.6 TB of data to it. That takes 12 - 24 hours. Then I am all done. The drive is always mounted and always available for access. Then (very importantly) I do that again to my two 12 TB Easystore drives ($200 US each). Those are for backup only, except I use them to restore that data to other computers that have large spinning drives. By the time I do all that I have at least 6 copies of all my data in different places.
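A bulk one-shot copy like this is worth verifying before trusting it as a backup. A minimal sketch, using only coreutils, with temp directories standing in for the real drives:

```shell
#!/bin/sh
# Demo only: temp dirs stand in for the source data and the backup drive.
SRC="$(mktemp -d)"; DEST="$(mktemp -d)"
echo "data" > "$SRC/file1"

cp -rp "$SRC/." "$DEST/"          # the bulk copy

# Checksum both trees and compare, so the copy is verified
# byte-for-byte before it is trusted as a backup.
S1="$(mktemp)"; S2="$(mktemp)"
( cd "$SRC"  && find . -type f -exec cksum {} + | sort ) > "$S1"
( cd "$DEST" && find . -type f -exec cksum {} + | sort ) > "$S2"
cmp -s "$S1" "$S2" && echo "verified"
```

cksum is slow on terabytes of data; for a real 9.6 TB copy a faster hash (or rsync with its checksum option) would be preferable, but the copy-then-compare idea is the same.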
Hoser Rob
Level 20
Posts: 11796
Joined: Sat Dec 15, 2012 8:57 am

Re: Best Practice - storing large amounts of data

Post by Hoser Rob »

Timeshift is only meant for backing up root; it's not meant for general backups, including data in /home. There is, though, an option for backing up the hidden config files in /home, which I always use.
For every complex problem there is an answer that is clear, simple, and wrong - H. L. Mencken
3pinner
Level 3
Posts: 128
Joined: Mon Mar 05, 2012 8:47 pm

Re: Best Practice - storing large amounts of data

Post by 3pinner »

thank you both!
For this purpose I'm using one desktop. I have a 4 TB drive for her photos, and will add a backup drive later.

So I'm now thinking of starting off with one drive (1 TB) for the OS, the home folder, and Timeshift (for backing up root and hidden config files), and a 4 TB drive for the photos.
Right now she has a 1 TB drive that gives constant warnings of 0 bytes free, so I'll move her photos out of /home to the larger drive, do a backup of /home, then install LM 21.

My computer is a laptop; I store my photos on a separate internal drive and back them up to an external drive.

I'll be honest - I've been kinda lax about backups, but am 'changing my evil ways' starting today!


edit - all my drives are SSDs
Last edited by 3pinner on Sat Jun 03, 2023 11:22 am, edited 1 time in total.
spamegg
Level 14
Posts: 5036
Joined: Mon Oct 28, 2019 2:34 am

Re: Best Practice - storing large amounts of data

Post by spamegg »

I keep important data on separate drives, never on the drive where Mint is installed. This also has the upside that when a new major Mint version comes out, I can nuke it and reinstall without any worries.
3pinner
Level 3
Posts: 128
Joined: Mon Mar 05, 2012 8:47 pm

Re: Best Practice - storing large amounts of data

Post by 3pinner »

spamegg - That's exactly what I'm thinking.
Having a drive that is completely full and getting ready for a new install is a real headache.
Lady Fitzgerald
Level 15
Posts: 5805
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Best Practice - storing large amounts of data

Post by Lady Fitzgerald »

3pinner, I've used the same arrangement for years, even before I started using Linux: keeping data segregated from System files (OS, programs, etc.). In laptops that have only one drive, I used a separate partition for data and didn't keep any data in Home. In laptops and desktops that have more than one drive (like you, all my drives are SSDs), I put the System on its own drive and data on their own drive(s). This makes making and maintaining backups much simpler and faster, since I can use different programs that are specifically best suited for backing up either System files or data files. I'm baffled as to why the default in Mint is to keep user data in Home instead of a separate drive or partition, unless it's a carryover from the days when most computers, especially laptops, had only one drive.

On my current laptop I have a 512GB boot drive that has just Linux Mint installed on it (no data). My data is kept on three other drives, one 4TB drive and two 8TB drives (one 8TB drive is split into two 4TB partitions so I can use retired 4TB drives for external backup drives; I'm cheap that way).

My new desktop that I'll be picking up in two or three days will be set up pretty much the same way: one 512GB boot drive and three 8TB data drives (two will have 4TB partitions so, again, I can use retired 4TB drives for external backup drives; I've got those things coming out my ears and a...nother orifice).

I use Timeshift for my OS for the same purpose that System Restore served in Windows. I also use Rescuezilla to make weekly images of my boot drive (I can use the images to fully restore my boot drive should it completely fail).

I back up my data to external drives using FreeFileSync. I keep multiple backup drives for each data drive, both at home and offsite in my safe deposit box at my credit union.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
billyswong
Level 8
Posts: 2177
Joined: Wed Aug 14, 2019 1:02 am

Re: Best Practice - storing large amounts of data

Post by billyswong »

Lady Fitzgerald wrote: Sat Jun 03, 2023 11:54 am I'm baffled as to why the default in Mint is to keep user data in Home instead of a separate drive or partition unless it's a carryover from the days when most computers, especially laptops, had only one drive.
For most personal computers, only one drive is installed. When the drive is small, it is easy for newbies to guess wrong about how much space should be dedicated to the root partition and how much should be left for /home. You end up seeing people who followed outside "advice" or "tutorials" for installing Linux come in asking for help because the root partition has become full and they can no longer boot the computer correctly.

I don't know where you got the impression that PCs today are built with multiple drives everywhere.
Lady Fitzgerald
Level 15
Posts: 5805
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Best Practice - storing large amounts of data

Post by Lady Fitzgerald »

billyswong wrote: Sat Jun 03, 2023 12:40 pm
Lady Fitzgerald wrote: Sat Jun 03, 2023 11:54 am I'm baffled as to why the default in Mint is to keep user data in Home instead of a separate drive or partition unless it's a carryover from the days when most computers, especially laptops, had only one drive.
For most personal computers, only one drive is installed. When the drive is small, it is easy for newbies to guess wrong about how much space should be dedicated to the root partition and how much should be left for /home. You end up seeing people who followed outside "advice" or "tutorials" for installing Linux come in asking for help because the root partition has become full and they can no longer boot the computer correctly.

I don't know where you got the impression that PCs today are built with multiple drives everywhere.
I don't have that impression and I never said I did. Desktops with room for only one drive are rare, if they exist at all. Most laptops nowadays have room for more than one drive, with two drives probably being the most common. In the case of "one drive wonders", a separate partition can be used for data, something I mentioned being possible in the part of my post you improperly quoted above. In an earlier part of my post, which you chose not to quote, I even said I had used a separate partition for data in laptops that had only one drive.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
TaterChip
Level 5
Posts: 632
Joined: Sat Apr 22, 2023 12:34 pm
Location: Everywhere USA

Re: Best Practice - storing large amounts of data

Post by TaterChip »

3pinner wrote: Sat Jun 03, 2023 10:14 am I'm looking for suggestions from those that have a LOT of data on their computer. ...
Or am I making this too complicated??
What do you use for an insane amount of data?
I am a full-time nature photographer. I have tens of thousands of images, and I keep them all on a separate external hard drive that is also backed up on two additional external hard drives.

I use FreeFileSync to make sure everything is backed up and duplicated across multiple drives. I also use FFS's versioning option instead of just over-writing the changed files.
MSI Steel series GL75 Leopard, i7-10750H, 64GB RAM ... LMDE6
User avatar
Lady Fitzgerald
Level 15
Posts: 5805
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Best Practice - storing large amounts of data

Post by Lady Fitzgerald »

TaterChip wrote: Sat Jun 03, 2023 1:32 pm ...I am a full-time nature photographer. I have tens of thousands of images, and I keep them all on a separate external hard drive that is also backed up on two additional external hard drives...
I wish more people were as smart as you are when it comes to backups!

TaterChip wrote: Sat Jun 03, 2023 1:32 pm ...I use FreeFileSync to make sure everything is backed up and duplicated across multiple drives...
So do I.

TaterChip wrote: Sat Jun 03, 2023 1:32 pm ...I also use FFS's versioning option instead of just over-writing the changed files.
I do the same thing. I'm not sure whether FFS actually overwrites older files being replaced by newer versions or just deletes and replaces them, but the result is the same.

For those who don't know: instead of FFS deleting changed and deleted files from the destination drive when they differ from what's on the source, FFS Versioning, when enabled, moves them to a user-designated Versioning folder so the user can examine them and confirm they should be deleted. This protects files that became corrupted or were accidentally deleted on the source drive from also being lost on the destination drive. It's a feature I highly recommend using.

Also, FFS treats folders and files the same way. You can designate an entire drive or just a folder (or group of folders) as the source and destination.
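The versioning behaviour described above can be sketched in a few lines of shell (hypothetical file names; temp directories stand in for the source and destination drives):

```shell
#!/bin/sh
# Demo of "versioning": instead of overwriting a changed file in the
# backup, park the old copy in a dated Versioning folder first.
SRC="$(mktemp -d)"; DEST="$(mktemp -d)"
VERS="$DEST/Versioning/$(date +%Y-%m-%d)"

echo "old edit" > "$DEST/photo.txt"   # stale copy already in the backup
echo "new edit" > "$SRC/photo.txt"   # freshly changed source file

mkdir -p "$VERS"
mv "$DEST/photo.txt" "$VERS/"        # preserve the older version
cp -p "$SRC/photo.txt" "$DEST/"      # then bring the backup up to date
```

FreeFileSync does this automatically on every sync; the sketch only illustrates the move-aside-before-replace idea that protects the older copy.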
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
3pinner
Level 3
Posts: 128
Joined: Mon Mar 05, 2012 8:47 pm

Re: Best Practice - storing large amounts of data

Post by 3pinner »

Lady Fitzgerald and TaterChip - Thank you both.
I started using a separate internal drive in my laptop, and will add an external drive for storage. I'll just keep newly downloaded images on the internal drive, moving them after I get around to processing them.
Both of you have some great insight. I'm not new to computers or photography, but as with any other trade or skill, I'm always learning!
Thanks!
billyswong
Level 8
Posts: 2177
Joined: Wed Aug 14, 2019 1:02 am

Re: Best Practice - storing large amounts of data

Post by billyswong »

@Lady Fitzgerald

So you read my first and last sentences, then ignored everything in between, huh? I am *against* setting up multiple partitions for a single-drive computer here, especially as a default setup that disregards the capacity of that drive. Whether the computer case contains space for extra drives is not my concern here, because a default OS setup procedure should cater to the worst-case scenario.

In my opinion, backing up the whole /home folder isn't any harder than backing up a /home partition. There is some minor convenience for a "clean" OS reinstall, like being able to skip a proper backup before reinstalling or testing a new OS flavor. But such a "clean" reinstall isn't really clean, as one is carrying over user config without being 100% sure it is compatible with the new OS and the new software therein. If one uses it for multi-boot, there is further risk of a mess.

In the HDD era, separate partitions usually meant a separate lifetime for each of them, but in the SSD era this is no longer true. The wear limits of multiple partitions on the same drive are shared: sectors on an SSD aren't mapped to physical NAND in a fixed manner but are virtualized for wear leveling. Same drive = same risk.

So in conclusion: I disagree with your recommendation of setting up separate partitions for / and /home on a single-drive computer, which is the common hardware configuration when not paying extra. Even for people with multiple drives installed, I would still recommend that the second drive by default be a pure data drive, without pollution from user config in /home. User config should only be backed up by a user who consciously knows exactly what they are copying.
Lady Fitzgerald
Level 15
Posts: 5805
Joined: Tue Jan 07, 2020 3:12 pm
Location: AZ, SSA (Squabbling States of America)

Re: Best Practice - storing large amounts of data

Post by Lady Fitzgerald »

billyswong wrote: Sat Jun 03, 2023 10:31 pm @Lady Fitzgerald

So you read my first and last sentence then ignore everything in between huh?...
Good grief, you didn't understand anything I said! No, I did not read your first and last sentence and then ignore everything between. :roll:
billyswong wrote: Sat Jun 03, 2023 10:31 pm ...I am *against* setting up multiple partitions for a single-drive computer here, especially as a default setup that disregards the capacity of that drive. Whether the computer case contains space for extra drives is not my concern here, because a default OS setup procedure should cater to the worst-case scenario...
I agree with your opinion that the default OS setup procedure should cater to a worst-case scenario; however, I do not believe the current procedure of mixing personal data files with System files is the way to do so. I don't see why the default setup procedure can't include an option to set up a separate data partition during installation. And where did you get the idea I suggested disregarding the capacity of a drive? I never said any such thing.

billyswong wrote: Sat Jun 03, 2023 10:31 pm ...In my opinion, backing up the whole /home folder isn't any harder than backing up a /home partition...
First, I'm not talking about a whole /home vs a /home partition. I'm advocating leaving the /home folder intact as installed and moving personal data files (documents, folders, music, photos, etc.) to a separate data partition (in the case of a one-drive computer) or data drive(s).

Second, backups of data and System files are best done with different kinds of programs. System files are best backed up with imaging programs, since most cannot simply be copied and pasted. However, imaging data files means all of them have to be included in every image, which results in enormous images that waste drive space and take an unreasonable amount of time. Backing up data files with a folder/file syncing program that works differentially saves time and drive space and, in the case of SSDs, reduces unnecessary drive writes.

billyswong wrote: Sat Jun 03, 2023 10:31 pm ...There is some minor convenience for a "clean" OS reinstall, like being able to skip a proper backup before reinstalling or testing a new OS flavor. But such a "clean" reinstall isn't really clean, as one is carrying over user config without being 100% sure it is compatible with the new OS and the new software therein. If one uses it for multi-boot, there is further risk of a mess...
Where on Earth did all that come from? I wasn't talking about doing "clean" reinstalls or installs of a newer OS version. I was talking about backups which are used to restore things back to the way they were at the time the backup was made. You're comparing apples to kumquats, not to mention going off topic.

billyswong wrote: Sat Jun 03, 2023 10:31 pm ...In the HDD era, separate partitions usually meant a separate lifetime for each of them, but in the SSD era this is no longer true. The wear limits of multiple partitions on the same drive are shared: sectors on an SSD aren't mapped to physical NAND in a fixed manner but are virtualized for wear leveling. Same drive = same risk...
Again, where did all that come from? The purpose of the partitioning I was talking about has nothing to do with wear leveling, comparing SSDs to HDDs or comparing risks. BTW, SSDs don't have sectors; they have cells.

billyswong wrote: Sat Jun 03, 2023 10:31 pm ...So in conclusion: I disagree with your recommendation of setting up separate partitions for / and /home on a single-drive computer, which is the common hardware configuration when not paying extra. Even for people with multiple drives installed, I would still recommend that the second drive by default be a pure data drive, without pollution from user config in /home. User config should only be backed up by a user who consciously knows exactly what they are copying.
So in conclusion, you again are misinterpreting what I wrote. Again, I was recommending installing (or keeping the original installation) as is and merely moving personal data (documents, music, etc.) to a separate data partition or data drive. With a separate data partition, no extra cost would be involved, so where did that come from? You recommend that, when multiple drives are installed, the second drive should be a pure data drive. That is exactly what I've been saying!

When backing up the system files, user configuration files located in /home can be included in the system backup. It's not rocket science, for crying out loud.
Jeannie

To ensure the safety of your data, you have to be proactive, not reactive, so, back it up!
lsemmens
Level 11
Posts: 3949
Joined: Wed Sep 10, 2014 9:07 pm
Location: Rural South Australia

Re: Best Practice - storing large amounts of data

Post by lsemmens »

My method, albeit slow and potentially problematic, is to copy (yes, using drag and drop) my "critical" files across the network to one or two of my other computers. I also have a USB stick for the stuff I might need on the road.
Fully mint Household
Out of my mind - please leave a message
TaterChip
Level 5
Posts: 632
Joined: Sat Apr 22, 2023 12:34 pm
Location: Everywhere USA

Re: Best Practice - storing large amounts of data

Post by TaterChip »

Lady Fitzgerald wrote: Sat Jun 03, 2023 3:09 pm I wish more people were as smart as you are when it comes to backups!
I had to learn the hard way...... but I did learn :lol:


Lady Fitzgerald wrote: Sat Jun 03, 2023 3:09 pm For those who don't know, instead of FFS deleting changed and deleted files from the destination drive when they differ from what's on the source file, FFS Versioning, when enabled, sends them to a user designated Versioning folder so the user can examine them to make sure they are to be deleted. This protects files that became corrupted or accidentally deleted on the source drive from being lost due to also being deleted on the destination drive. It's a feature I highly recommend using.

Also, FFS treats folders and files the same way. You can designate an entire drive or just a folder (or group of folders) as the source and destination.
IMO, here is the best thing about FFS: every file it copies can be accessed by any file manager. It doesn't put your files in a program-specific format.
MSI Steel series GL75 Leopard, i7-10750H, 64GB RAM ... LMDE6
TaterChip
Level 5
Posts: 632
Joined: Sat Apr 22, 2023 12:34 pm
Location: Everywhere USA

Re: Best Practice - storing large amounts of data

Post by TaterChip »

3pinner wrote: Sat Jun 03, 2023 8:50 pm Lady Fitzgerald and TaterChip - Thank you both.
I started using a separate internal drive in my laptop, and will add an external drive for storage.I'll just keep newly downloaded images on the internal drive, moving them after I get around to processing them.
Both of you have some great insight. I'm not new to computers or photography, but any other trade or skill, I'm always learning!
Thanks!
You're welcome 3pinner

From a photography standpoint, here is the file structure I have found works best for me. I have tried several different file labeling formats; for at least the last several years I have settled on this one. I am lucky enough to get to travel the country as a nature photographer, so my folder labeling may be a bit different from others'.

I start off with a year folder, "2008", then drill down from there. Below is an actual shoot folder.

2008-04_IL_SnakeRoad_CottonmouthPortrait

I start with the year-month, then the state of the shoot, the location of the shoot, and a description of the shoot. Starting with the year-month puts all the shoot folders into loose chronological order within the main year folder.
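That scheme is easy to script. A small sketch (the shoot values are invented examples, and a temp directory stands in for the photo drive) that builds the year folder and the shoot folder in one go:

```shell
#!/bin/sh
# Build year/<year-month>_<State>_<Location>_<Description> in one mkdir.
year_month="2008-04"; state="IL"
location="SnakeRoad"; desc="CottonmouthPortrait"

root="$(mktemp -d)"                       # stands in for the photo drive
# ${year_month%-*} strips the "-04", leaving the year folder "2008".
shoot="$root/${year_month%-*}/${year_month}_${state}_${location}_${desc}"
mkdir -p "$shoot"
```

Because the name starts with the sortable year-month, a plain alphabetical listing of the year folder is already in chronological order.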

This naming convention lets me see at a glance what's inside the folder. Since I am in the process of switching to Linux full time, I have started using DigiKam (instead of Lightroom) to pull keywords from the images and help me find the exact images I am looking for. My file naming system, combined with DigiKam, seems to be a great combination for finding images fast.

As it stands, I am currently working with 53,988 files and 1,023 folders. I doubt I could be as efficient without my naming system and DigiKam's sorting by keywords. Those numbers used to be a lot higher, but I recently removed all the RAW files that would never get processed for one reason or another.

I USED to do my photography the way you're doing yours. I would put new shoots on my internal secondary drive. Then one day I forgot to back up, and to compound things, tragedy struck. I went to boot my computer (WinXP) and it wouldn't boot. After reinstalling the OS, I went to access my secondary drive and it was unusable; I couldn't get the computer to read it. I can only speculate about what happened. What I do know is that after I copied the shoot from the 3.5" floppy used by my Sony Mavica digital camera, I formatted the disk. So that shoot was lost forever. Back then, photography was a hobby instead of part of my business.

That's when I started looking for other ways to do things... lesson learned the hard way.

Now I have an external "working" photography drive where I start off with two folders: Processing and Portfolio. All new shoots go into Processing, and when done they get moved over to Portfolio. The Portfolio folder houses the file naming structure shown above.

When I am done for the day, I use FreeFileSync to mirror the "working" photography drive (with versioning enabled) over to the first external backup. Once complete, I run it again to the second external backup. For me this simplifies things, since I don't have to change the source location and I can back up everything in one step.
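The "one source, two mirrors" routine can be sketched as a loop over the backup destinations (temp directories stand in for the real drives; a real mirror would also handle deletions and versioning, which FreeFileSync does for you):

```shell
#!/bin/sh
# Demo only: temp dirs stand in for the working drive and two backups.
WORK="$(mktemp -d)"; BK1="$(mktemp -d)"; BK2="$(mktemp -d)"
echo "raw" > "$WORK/shoot.txt"

# Same source, each backup destination in turn; -u skips unchanged files
# on repeat runs, so only new and changed material is copied.
for dest in "$BK1" "$BK2"; do
    cp -rup "$WORK/." "$dest/"
done
```

Keeping one fixed source and looping over destinations is what makes the "never change the source location" habit safe: there is no chance of accidentally mirroring a backup over the working drive.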

Hope this helps.
MSI Steel series GL75 Leopard, i7-10750H, 64GB RAM ... LMDE6
Termy
Level 12
Posts: 4248
Joined: Mon Sep 04, 2017 8:49 pm
Location: UK

Re: Best Practice - storing large amounts of data

Post by Termy »

I put these things on separate drives. I don't want my HOME filled up to the brim like that. For one thing, it can make backups a huge pain. My HOME contains my usual configuration stuff, my immediate work and documents, music I actually listen to, and that's pretty much it. The rest is stored on other devices.
I'm also Terminalforlife on GitHub.
3pinner
Level 3
Posts: 128
Joined: Mon Mar 05, 2012 8:47 pm

Re: Best Practice - storing large amounts of data

Post by 3pinner »

TaterChip wrote: Sun Jun 04, 2023 2:00 pm ...From a photography standpoint here is the file structure that I have found that works best for me. ...
Hope this helps.
Thank you.
I use a similar folder system. I use Rapid Photo Downloader; it creates folders by date for me, and I can also add location and subject as part of the folder name. I use Darktable for processing. When I load the RAW images into Darktable, I can also add ratings and description tags there that are added to the image data. That way I can search by date, subject, location, or whatever tag I generated. I don't have anywhere near as many photos as you do, but I'm planning on it. What I was looking for was how people like you handle a potentially vast number of files, and this system looks good.

FreeFileSync - between you and Lady Fitzgerald, I will look into it this week.

As for separate storage, I am convinced now. I am going to see how I can implement that in my system. Problem is, I just spent a boatload on drives for my wife's desktop. I'll probably steal one of the new 4 TB drives from her, make that my external Processing/Portfolio drive, and just get an external drive for her and back up whatever she has.

I don't want to hijack my own thread any further, so we could discuss this elsewhere - have you tried Darktable for sorting & processing?
Thanks again!
red-striped-zebra
Level 3
Posts: 139
Joined: Thu Feb 10, 2022 4:29 am

Re: Best Practice - storing large amounts of data

Post by red-striped-zebra »

mikeflan wrote: Sat Jun 03, 2023 10:33 amThe drive is always mounted and always available for access.
By 'mounted', you mean here, in layman's terms, that your external drive is always connected by wire to the main computer.