
python - implement a differentiable L0 regularizer to keras layer - Stack Overflow


What is the appropriate way to implement a differentiable variant of L0 regularizer (count the non-zero values in a Conv layer / matrix) to keras layer?

I was thinking of using r(x) = tanh(abs(f*x)) for the matrix x, where f is a factor that scales the range in which values are considered small (tanh(0.55) ≈ 0.5, so for f=1: r(0.55) ≈ 0.5).

  • Is the regularizer result simply added to the loss for the training loop?
  • Should the result of the regularizer function be divided by the size of the matrix (to get a mean, like the loss function)? I have not seen that in the built-in L1 or L2 classes; there the sum is used. I want the penalty to be independent of layer size.
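The size-independence question can be checked numerically. A minimal NumPy sketch of the proposed r(x) = tanh(|f·x|), reduced with a mean rather than a sum (the function name and `f` default are illustrative assumptions):

```python
import numpy as np

def l0_surrogate(x, f=1.0):
    """Differentiable L0 surrogate: mean of tanh(|f * x|).
    Using the mean (not the sum, as Keras' built-in L1/L2 do)
    makes the penalty independent of the matrix size."""
    return np.mean(np.tanh(np.abs(f * x)))

# Same per-entry value -> same penalty regardless of layer size.
small = l0_surrogate(np.full((3, 3), 0.55))    # ~0.5 (tanh(0.55) ≈ 0.5)
large = l0_surrogate(np.full((10, 10), 0.55))  # identical to `small`
```

With a sum reduction, the 10×10 matrix would be penalized about 11× more than the 3×3 one for the same per-entry magnitude, which is exactly the layer-size dependence the question wants to avoid.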


https://keras.io/api/layers/regularizers/#creating-custom-regularizers
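Following the linked Keras guide, such a surrogate can be packaged as a custom `Regularizer` subclass. A sketch, assuming TensorFlow's Keras (the class name, `strength`, and `f` are made-up parameters; the mean reduction is the questioner's size-independence choice, not Keras' default):

```python
import tensorflow as tf

class L0Surrogate(tf.keras.regularizers.Regularizer):
    """Hypothetical differentiable L0 surrogate: strength * mean(tanh(|f * w|))."""

    def __init__(self, strength=0.01, f=1.0):
        self.strength = strength
        self.f = f

    def __call__(self, w):
        # Mean over all entries -> penalty independent of layer size.
        # Keras adds this scalar to the task loss automatically during training,
        # so yes: loss + regularizer are simply summed.
        return self.strength * tf.reduce_mean(tf.tanh(tf.abs(self.f * w)))

    def get_config(self):
        # Needed so models using the regularizer can be serialized.
        return {"strength": self.strength, "f": self.f}

# Attach it to a Conv layer's kernel like any built-in regularizer:
layer = tf.keras.layers.Conv2D(8, 3, kernel_regularizer=L0Surrogate(0.01, f=10.0))
```

Because `tanh` and `abs` are differentiable almost everywhere, gradients flow through the penalty during backpropagation, unlike a true L0 count.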

asked Mar 19 at 16:00 by pas-calc

1 Answer


A possibility is to use the cost function L_a(x) = 1/(1+(a/x)²), which equals 0.5 at x = a: L_a(a) = 0.5. Here a is a threshold parameter: the function tends to 0 as x → 0 and approaches 1 for |x| ≫ a, so it smoothly counts entries that are large relative to a.
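A small numeric check of this cost function. Note that 1/(1+(a/x)²) can be rewritten algebraically as x²/(x²+a²), which is the same function but avoids the division by zero at x = 0 (function name is illustrative):

```python
import numpy as np

def l0_rational(x, a=0.1):
    """Answer's cost L_a(x) = 1/(1+(a/x)^2), rewritten as x^2/(x^2+a^2)
    so it is well-defined (and equal to 0) at x = 0."""
    x2 = np.square(x)
    return x2 / (x2 + a * a)
```

At x = a the penalty is exactly 0.5, near-zero weights contribute almost nothing, and weights far above the threshold each contribute close to 1, approximating a per-entry non-zero count.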
