Windows users are supposed to use exFAT on 256GB volumes and stop questioning things.
(with 128KB cluster size which would give Mr. Plummer a heart attack)
How hard is it to buy a new TV recorder if your old one sucks?
The TV recorder works just fine with FAT32. Throwing away a perfectly fine DVB-T decoder/recorder just because Microsoft thinks it's better to use exFAT is bad for the environment.
Note: most of this post is irrelevant; for an unknown reason my brain switched to thinking the issue was with a memory card.
You are overthinking it and trying to find malice where there is none. It so happens you have a quite niche issue, where all three of the following are true:
- Your decoder supports SDXC cards, but fails to support the standard file system of SDXC cards (exFAT). Alternatively: it only has SDHC support, but you PEBCAK-ed and bought an SDXC card.
- The decoder lacks the ability to create a non-standard FAT32, despite requiring it.
- You are trying to use a very outdated tool.
If you PEBCAK-ed, and the decoder advertises only SDHC support (or fails to mention the standard at all), be aware that SDXC card handling is unintended and untested. If the 32 GB limit of SDHC isn't explicitly enforced, nothing stops the implementation from operating on a larger SDXC card as if it were SDHC. This may work, but it may just as well only give the appearance of working: data overwritten after reaching 32 GB, filesystem corruption, files behaving strangely, and so on.
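A hedged sketch of what that silent wrap-around could look like. This is a purely illustrative Python model, not any real firmware; the dictionary-backed "card" and the function name are invented for the example:

```python
# Illustrative model: an SDHC-only implementation that ignores
# address bits beyond the 32 GB SDHC maximum.
SDHC_LIMIT = 32 * 10**9  # SDHC tops out at 32 GB

def sdhc_only_write(storage, offset, data):
    """Hypothetical firmware write path: the offset silently wraps around."""
    offset %= SDHC_LIMIT
    storage[offset] = data

storage = {}
sdhc_only_write(storage, 0, "start of card")
sdhc_only_write(storage, SDHC_LIMIT, "data past 32 GB")
# The second write wraps back to offset 0 and clobbers the first one:
print(storage[0])  # prints "data past 32 GB"
```

The point is that nothing fails loudly: every write "succeeds", and the corruption only shows up once earlier data is read back.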
The TV recorder works just fine with FAT32. Throwing away a perfectly fine DVB-T decoder/recorder just because Microsoft thinks it's better to use exFAT is bad for the environment.
Microsoft doesn't care about the environment (nor does anyone else) unless it's some Hot Issue that they could run a TV commercial about.
Easing the creation of large FAT32 filesystems to support old hardware is apparently not one of those.
Why are you talking about SDXC cards?
That’s a good question. I don’t know. My guess is somebody mentioned SD cards and my brain switched to this mode. I deleted most of the post, as it seems irrelevant.
However, the part I didn't delete still applies: you are overthinking it. Three decades ago, when even 32 GB would have seemed an absurd idea, somebody failed to predict the future. What was a reasonable choice back then didn't age well, but it certainly wasn't an intentional move from Microsoft to make you throw away a good decoder. Too many moving parts would have had to fit together just right, each of them almost unpredictable, certainly not expected. I would sooner believe the opposite: that Microsoft planned to limit the use of FAT32 before 2010 in an attempt to force a move to newer Windows.
The limit was put in in the mid/late '90s, when hard drives were commonly a few hundred MB.
40 GB hard disks became available around 2000, and Windows 98 supported creating and mounting FAT32 filesystems of this size just fine.
Windows 2000 did too; it just wouldn't let you format them through the GUI.
Whatever reason there was, it must have been very contrived, bordering on nonsensical, even back then, and it is certainly nonsensical today. Plummer mentions wanting to keep the cluster count below 4 million, so perhaps there was some notable implementation in the wild with this limitation, or they found that Windows performance suffered too badly with that many clusters and tried to push people towards NTFS instead, something like that.
Fun trivia: I have a photo camera which, I think, loads the whole FAT (the data structure) into RAM. It takes a long time to boot if the card is formatted with too many clusters, and with even more clusters it reboots randomly under normal use. It doesn't matter what the card's capacity is; the cluster count makes the difference.
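The scaling is easy to put numbers on: a FAT32 table entry is 4 bytes and there is one entry per cluster, so a device that caches the whole FAT pays 4 bytes of RAM per cluster. A quick back-of-the-envelope sketch (the function is just arithmetic for illustration):

```python
# One 4-byte FAT32 entry per cluster: FAT size = cluster count * 4 bytes.
def fat32_table_bytes(volume_bytes, cluster_bytes):
    clusters = volume_bytes // cluster_bytes
    return clusters * 4

GiB = 1024**3
KiB = 1024
MiB = 1024**2

# 32 GiB volume, 16 KiB clusters: 2 million clusters -> 8 MiB FAT
print(fat32_table_bytes(32 * GiB, 16 * KiB) // MiB)  # prints 8
# Same volume, 4 KiB clusters: 8 million clusters -> 32 MiB FAT
print(fat32_table_bytes(32 * GiB, 4 * KiB) // MiB)   # prints 32
</n```

On a camera with a few dozen MB of working RAM, the difference between an 8 MiB and a 32 MiB in-memory FAT is very plausibly the difference between "slow boot" and "random reboots".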
40 GB hard disks became available around 2000, and Windows 98 supported creating and mounting FAT32 filesystems of this size just fine.
Windows 2000 did too; it just wouldn't let you format them through the GUI.
And those two UIs were created by different people at different times.
The reason was "pick a number which sounds reasonable for your bounds". There's not much more depth to it; it was one box out of thousands being put together, apparently with the expectation that it would all get a second pass later, which never occurred.
Three decades ago, when even 32 GB would have seemed an absurd idea, somebody failed to predict the future.
What was a reasonable choice back then didn't age well, but it certainly wasn't an intentional move from Microsoft to make you throw away a good decoder.
I didn't say that, it was a question because Magic suggested:
How hard is it to buy a new TV recorder if your old one sucks?
Because you asked how an ordinary non-technical consumer is supposed to use Windows. That's the way, of course.
To sum up, Linux not only saved the day but also the environment (or at least a tiny bit of it).
I'm just puzzled why people sometimes say that Windows is so easy to use for non-technical people.
The tools work just fine from a Windows-only perspective: they will format FAT32 up to 32 GB, and either NTFS or exFAT beyond that. Any version of Windows since XP, or WinCE 6, supports both. From the MS perspective of only caring about Windows, it even makes sense. FAT32 was not great for a lot of use cases, and gets worse on larger filesystems. By making people "choose" the "right" filesystem for their application, they "saved" them from a choice that might cause performance or functionality problems in the future.
Formatting a drive for use in a non-Windows system was just not a consideration, and this is the context that people who say "Windows is easier to use" have in mind.
I'll ask the question that I posted above again: if the drive is going to be written to by a TV recorder device, shouldn't the TV recorder device be expected to format the drive according to its requirements?
Yes and no.
Some systems do reformat the drive to suit their needs; some add their own compression on the drives, which in turn renders them un-clonable or impossible to back up... it depends on the maker's choice.
Normally you would be hit by the famous 4 GB single-file limitation if you are not using NTFS; some devices split the file... it really depends.
Most boxes or recorders are *nix or Android based, can record big files if needed, and you can access the files... that's what I have.
But Canadian companies like Bell, Videotron and others are closed source with closed drives: you cannot read them, nor take them out of the box and back them up... you can only reformat them anew.
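For context on that 4 GB limitation: FAT32 stores a file's size in a 32-bit field, so a single file maxes out at 2**32 - 1 bytes, one byte short of 4 GiB. A recorder that splits recordings just starts a new file before hitting that ceiling. A minimal sketch; the function name and the bitrate figure are made up for illustration:

```python
# FAT32 keeps file sizes in a 32-bit field, capping a file at 4 GiB minus 1 byte.
FAT32_MAX_FILE = 2**32 - 1

def segments_needed(recording_bytes, segment_limit=FAT32_MAX_FILE):
    """How many files a recording must be split into on FAT32 (ceiling division)."""
    return (recording_bytes + segment_limit - 1) // segment_limit

# A two-hour broadcast at ~8 Mbit/s is roughly 7.2 GB of data:
print(segments_needed(7_200_000_000))  # prints 2
```

This is why long recordings from such boxes often show up as a series of numbered files rather than one big one.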
Not sure if it's a GUI on top of command line tools or a GUI and command line tools on top of C/C++ libraries and APIs.
More or less the second one. Both GUI and command line tools call APIs to do their work.
Windows is a collection of often competing APIs. Since NT4 for all practical purposes the GUI is the shell.
The CLI is (merely) a command processor that runs within the graphical shell. Even in installation, recovery and headless server modes the graphics subsystem starts first. You can't boot directly to a CLI as far as I know.
Here's why it ended up with 32 GB, from the horse's mouth: https://youtu.be/bikbJPI-7Kg?si=hWx5hdDppKhhQecj&t=365
Am I hearing this correctly? If I am, the OP should be thankful to M$... maybe he forgot who invented exFAT (or FAT in general)? How ungrateful of him.
I just formatted my 64 GB pendrive (I don't have a larger one) with FAT32 (archaic FAT) and it works like a charm, so the claim is bollocks, unless you want to keep repartitioning drives with the command line...
I can’t tell if Mechatrommer’s message is sarcastic or not.
Yes, you can create a FAT32 file system larger than 32 GB. In fact the limit should be 128 GB, not 32 GB, but even Plummer himself no longer remembers the actual reasoning behind the decision. So we’ll never know why it’s 32 GB. And you can go beyond 128 GB with no issues; many tools, including Windows itself and Linux’s vfat module, handle such volumes without any trouble.
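For the curious, the theoretical ceiling is far beyond 128 GB. A FAT32 entry is 32 bits wide, but only 28 of them address clusters, and clusters go up to 32 KiB per the spec (64 KiB on some implementations). Rough numbers, ignoring the handful of reserved cluster values:

```python
# Approximate FAT32 volume-size ceilings: ~2**28 addressable clusters
# (28 of the 32 bits in an entry) times the cluster size.
KiB = 1024
TiB = 1024**4
MAX_CLUSTERS = 2**28  # slightly fewer in practice; a few values are reserved

for cluster_bytes in (8 * KiB, 32 * KiB, 64 * KiB):
    max_volume = MAX_CLUSTERS * cluster_bytes
    print(cluster_bytes // KiB, "KiB clusters ->", max_volume // TiB, "TiB")
# prints:
# 8 KiB clusters -> 2 TiB
# 32 KiB clusters -> 8 TiB
# 64 KiB clusters -> 16 TiB
```

So the on-disk format itself allows multi-terabyte volumes; the 32 GB cutoff lives purely in the formatting tool.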
Nobody ever claimed otherwise, and Microsoft’s own tools, built into Windows, allow you to do so. The problem affects a single, horrendously outdated tool from the 1990s. A tool which should’ve been removed from the distribution 20 years ago. But that would probably end with a massive outcry and claims that Microsoft is trying to force people onto NTFS.(1)
“Thankful” is certainly not the right word here. If anything, that would be following normal design principles. Nobody is thankful for a normal job being done. I’m not thankful that there are brakes in my car or that my computer PSU comes with a standard IEC 60320 socket. You may find such a limitation inadequate, but this is a prime example of historian’s fallacy.
If one finds segmented memory or the 640K limit absurd, I strongly suggest reading period books. Norton’s pink shirt guide makes it look elegant and much more reasonable.
exFAT is a good example of a FOSS counterpart to greenwashing, so I’m not sure how to interpret this part.
(1) I don’t mean the crowd wishing to stay with open solutions. This part I do understand and support. No, you would hear whining mostly from Windows users.
“Thankful” is certainly not the right word here. If anything, that would be following normal design principles.
I mean, all generations of FAT were designed by M$; I wonder why Linux never made its own FAT? Like Android did (a disk format unreadable by Windows). Anyway, what I meant is that exFAT was also designed by M$: if they hadn't opened it up by today, I guess Linux would still be in the dark FAT32 age; that's where the "thankful" comes from. But I don't follow the history very well, so I could be wrong. BTW, what I hate about NTFS is just that it has a "System Volume Information" directory that I can't get rid of; other than that, I don't care whether it's FAT32, exFAT or NTFS.
And also, what I meant is: if anybody can't format a flash drive as FAT32 in Windows, there are free external tools that can do that and even more; either he hasn't dug deep enough, or he just likes to cultivate false sentiment. But if he expects a bare Windows to do it well... I don't find a bare Windows (without software) useful anyway; we don't install Windows for the sake of Windows, we install it so that we can run software on it. So get used to that notion. If it weren't for Wine, IMHO Linux would be nothing (for me) other than a web server or web browser and some FOSS software; it doesn't even have complete device drivers. YMMV and cheers.
The real story:
M$ beancounters saw that 40 GB disks were just over the horizon and thought it would be cool if those disks used NTF$.
They came up with some crazy pseudotechnical justification for crippling FAT32 and sold it to a random autistic developer.
The poor autistic developer, being autistic, took it at face value and implemented a workaround that we know today.
(Simply breaking large FAT32 support in NT would backfire as Windows 95 was already creating such volumes in the wild).
A few years later, €xFAT comes out, almost identical to FAT32, but it supports large volumes and is lighter than NTF$.
The autistic developer doesn't even remember their pseudotechnical justification anymore and apologizes on YouTube.
Meanwhile beancounters are rubbing their hands...
M$ beancounters saw that 40 GB disks were just over the horizon and thought it would be cool if those disks used NTF$.
They came up with some crazy pseudotechnical justification for crippling FAT32 and sold it to a random autistic developer.
Plummer denied that in the video. So who's the autistic developer?
I forgot to add this earlier.
On “Linux saves the day”: since the principles at play are not specific to Windows, similar situations are found elsewhere too, including in tools typically used in Linux distros. An example? Try creating an 8192-bit RSA key in GnuPG. 🐧 “4 Kib is enough for everybody!” 🐧