Are activations per channel allowed in FINN? #1274

Unanswered
wortelio asked this question in Q&A

Hi,
First of all, thanks for your great work! I have two related questions, which I'm posting here.

1st)
I have been able to deploy several networks with FINN and was exploring more advanced configurations. I tried per-channel activations in Brevitas, with the following setup for QuantIdentity and QuantReLU:

QuantReLU(act_quant=Uint8ActPerTensorFloat, bit_width=act_bit_width, per_channel_broadcastable_shape=(1, oup, 1, 1), scaling_stats_permute_dims=(1, 0, 2, 3), scaling_per_output_channel=True)
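For completeness, here is the same configuration as a self-contained block. Only the QuantReLU keyword arguments come from the snippet above; the imports, the surrounding Conv/BatchNorm layers, the channel count `oup` and the bit width are assumptions made for illustration, and keyword names may differ between Brevitas versions.

```python
# Self-contained sketch of the per-channel activation setup above.
# The Conv/BatchNorm layers, channel count and bit width are assumed for
# illustration; only the QuantReLU overrides come from the original snippet.
import torch.nn as nn
from brevitas.nn import QuantConv2d, QuantReLU
from brevitas.quant.scaled_int import Uint8ActPerTensorFloat

act_bit_width = 8   # assumed bit width
oup = 64            # assumed number of output channels of the preceding conv

block = nn.Sequential(
    QuantConv2d(3, oup, kernel_size=3, padding=1, bias=False),
    nn.BatchNorm2d(oup),
    QuantReLU(
        act_quant=Uint8ActPerTensorFloat,
        bit_width=act_bit_width,
        # one scale per channel, broadcast over the (N, C, H, W) activation
        per_channel_broadcastable_shape=(1, oup, 1, 1),
        # permute activation statistics so the channel dimension comes first
        scaling_stats_permute_dims=(1, 0, 2, 3),
        scaling_per_output_channel=True,
    ),
)
```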

I tried many absorb/reorder transformations to get rid of the remaining Mul node, with no success, since the activation Quant nodes are no longer scalar/1D but full tensors. Are per-channel activations supported by FINN? If so, what sequence of transformations streamlines the model and absorbs those Mul nodes, as happens when the activation is quantized per tensor? (A sketch of the kind of sequence I tried is shown after the screenshot below.)

[Image: finn_per_channel_act]
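For reference, a minimal sketch of the kind of streamlining sequence I mean, written against a recent FINN release where the generic ONNX utilities live in the qonnx package (older releases expose the same classes under finn.*). The filename and the exact list of transformations are illustrative, not the precise sequence I ran.

```python
# Illustrative streamlining attempt; filenames are placeholders and the
# transformation list is an example, not the exact sequence used.
from qonnx.core.modelwrapper import ModelWrapper
from qonnx.transformation.infer_shapes import InferShapes
from qonnx.transformation.infer_datatypes import InferDataTypes

from finn.transformation.streamline import Streamline
from finn.transformation.streamline.absorb import AbsorbMulIntoMultiThreshold
from finn.transformation.streamline.reorder import (
    MoveMulPastDWConv,
    MoveScalarMulPastConv,
)

model = ModelWrapper("per_channel_act.onnx")  # placeholder filename
model = model.transform(InferShapes())
model = model.transform(Streamline())
# With per-tensor activations the scalar Mul nodes are absorbed here;
# with per-channel scales a channel-wise Mul remains in the graph.
model = model.transform(MoveScalarMulPastConv())
model = model.transform(MoveMulPastDWConv())
model = model.transform(AbsorbMulIntoMultiThreshold())
model = model.transform(InferDataTypes())
model.save("per_channel_act_streamlined.onnx")
```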

2nd)
I have observed that BatchNorm parameters sometimes hold values very close to zero. After the MultiThreshold nodes absorb them and the layers are converted to fpgadataflow, the weightDataType of the Thresholding layer has to be very wide to preserve the results, which greatly increases LUT usage.

I also tried per-channel activations to solve this, but it did not work. Is there any way to limit the precision of those values? Using BatchNorm2dToQuantScaleBias in Brevitas? Clamping them in PyTorch (see the sketch after the screenshot below)? Configuring QuantReLU as a ReLU6, with min/max values and a fixed scale, worked for me, but it caused an undesirable accuracy drop.

[Image: batchnorm_LUT]
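To make the "clamping in PyTorch" idea concrete, below is a hypothetical sketch that clamps the BatchNorm scale parameters away from zero during training, so the thresholds later derived from them do not need such a wide datatype. The function name and the 1e-3 floor are made up for illustration; I have not verified how much accuracy this costs.

```python
# Hypothetical workaround: keep BatchNorm gammas away from zero so that the
# folded thresholds stay in a narrower numeric range. The 1e-3 floor is an
# arbitrary illustrative choice, not a recommendation.
import torch
import torch.nn as nn

@torch.no_grad()
def clamp_batchnorm_scales(model: nn.Module, min_abs: float = 1e-3) -> None:
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            gamma = m.weight
            sign = torch.where(gamma >= 0, torch.ones_like(gamma), -torch.ones_like(gamma))
            gamma.copy_(sign * gamma.abs().clamp(min=min_abs))

# typical use inside the training loop, right after optimizer.step():
#   clamp_batchnorm_scales(model, min_abs=1e-3)
```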


Replies: 0 comments

