List:       gentoo-user
Subject:    Re: [gentoo-user] Re: Finally got a SSD drive to put my OS on
From:       Dale <rdalek1967 () gmail ! com>
Date:       2023-04-20 10:59:13
Message-ID: 205e4510-ce86-48c0-02c1-2dcfe11b8532 () gmail ! com

Peter Humphrey wrote:
> On Thursday, 20 April 2023 10:29:59 BST Dale wrote:
>> Frank Steinmetzger wrote:
>>> Am Wed, Apr 19, 2023 at 06:32:45PM -0500 schrieb Dale:
>>>> Frank Steinmetzger wrote:
>>>>> <<<SNIP>>>
>>>>>
>>>>> When formatting file systems, I usually lower the number of inodes from
>>>>> the default value to gain storage space. The default is one inode per
>>>>> 16 kB of FS size, which gives you 60 million inodes per TB. In practice,
>>>>> even one million per TB would be overkill in a use case like Dale’s
>>>>> media storage.¹ Removing those 59 million inodes frees 59 million ×
>>>>> 256 bytes ≈ 15 GB of net space for each TB, not counting extra control
>>>>> metadata and ext4 redundancies.
>>>> If I ever rearrange my
>>>> drives again and can change the file system, I may reduce the inodes at
>>>> least on the ones I only have large files on.  Still tho, given I use
>>>> LVM and all, maybe that isn't a great idea.  As I add drives with LVM, I
>>>> assume it increases the inodes as well.
>>> I remember from yesterday that the manpage says that inodes are added
>>> according to the bytes-per-inode value.
>>>
>>>> I wonder.  Is there a way to find out the smallest file in a
>>>> directory or subdirectory, the largest files, and then maybe an
>>>> average file size???
>>> The 20 smallest:
>>> `find -type f -print0 | xargs -0 stat -c '%s %n' | sort -n | head -n 20`
>>>
>>> The 20 largest: either use tail instead of head or reverse sorting with
>>> -r.
>>> You can also first pipe the output of stat into a file so you can sort and
>>> analyse the list more efficiently, including calculating averages.
>> When I first ran this while in / itself, it occurred to me that it
>> doesn't specify what directory.  I thought maybe changing to the
>> directory I want it to look at would work, but I get this:
>>
>>
>> root@fireball /home/dale/Desktop/Crypt # `find -type f -print0 | xargs
>> -0 stat -c '%s %n' | sort -n | head -n 20`
>> -bash: 2: command not found
>> root@fireball /home/dale/Desktop/Crypt #
>>
>>
>> It works if I'm in the / directory but not when I'm cd'd to the
>> directory I want to know about.  I don't see a spot to change it.  Ideas?
> In place of "find -type..." say "find / -type..."
>


Ahhh, that worked.  I also realized I need to leave off the backticks at
the beginning and end.  With those in, bash runs the pipeline and then
tries to execute its output as a command, which is where the "2: command
not found" came from.  I thought I left those out.  I copy and paste a
lot.  lol 
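
Frank's tip about saving the list to a file makes the rest easy too.
Something like this should do it (the output file name is just an
example, and the awk line for the average is untested on my end):

  find /home/dale/Desktop/Crypt -type f -print0 \
      | xargs -0 stat -c '%s %n' | sort -n > /tmp/filesizes.txt
  head -n 20 /tmp/filesizes.txt     # 20 smallest
  tail -n 20 /tmp/filesizes.txt     # 20 largest
  awk '{ total += $1 } END { if (NR) print total / NR }' /tmp/filesizes.txt   # average size in bytes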

It only took a couple dozen files to start getting up to some size.
Most of the small files are text files with little notes about a video.
For example, if I'm building something, I will create a text file that
lists what is needed to build what is in the video.  Other than a few of
those, file sizes reach a few hundred MB pretty quickly.  So the number
of small files is pretty small.  That is good to know. 

Thanks for the command.  I never was good with xargs, sed and such.  It
took me a while to get used to grep.  ROFL 

Dale

:-)  :-) 
