fitgabor

A simple tool to find the Gabor filter that maximizes the output of your neuron model

https://github.com/mohammadbashiri/fitgabor

Science Score: 44.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (8.6%) to scientific vocabulary
Last synced: 6 months ago

Repository

A simple tool to find the Gabor filter that maximizes the output of your neuron model

Basic Info
  • Host: GitHub
  • Owner: mohammadbashiri
  • Language: Python
  • Default Branch: master
  • Size: 6.18 MB
Statistics
  • Stars: 10
  • Watchers: 1
  • Forks: 8
  • Open Issues: 1
  • Releases: 0
Created over 5 years ago · Last pushed about 2 years ago
Metadata Files
  • Readme
  • Citation

README.md

Fit Gabor

If you have a neuron model and you want to find the Gabor filter that maximizes the activity of your model neuron, this tool might come in handy - it finds the input arguments to a Gabor-generating function via gradient descent.
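The core idea can be shown in a minimal, self-contained sketch (this is an illustration of the approach, not fitgabor's actual implementation - the `gabor` function, the toy linear neuron, and the parameter values here are all invented for the example):

```python
import math
import torch

def gabor(theta, sigma, Lambda, psi, size=32):
    # Differentiable Gabor patch: gradients flow back to the parameters.
    coords = torch.arange(size, dtype=torch.float32) - size / 2
    Y = coords.unsqueeze(1).expand(size, size)  # row coordinate
    X = coords.unsqueeze(0).expand(size, size)  # column coordinate
    x_r = X * torch.cos(theta) + Y * torch.sin(theta)
    y_r = -X * torch.sin(theta) + Y * torch.cos(theta)
    envelope = torch.exp(-(x_r ** 2 + y_r ** 2) / (2 * sigma ** 2))
    carrier = torch.cos(2 * math.pi / Lambda * x_r + psi)
    return envelope * carrier

# A toy "neuron": a linear filter matched to a known target Gabor.
target = gabor(torch.tensor(0.7), torch.tensor(4.0),
               torch.tensor(8.0), torch.tensor(1.0))
neuron = lambda img: (img * target).sum()

# Gabor parameters to fit, starting away from the target.
params = [torch.tensor(v, requires_grad=True) for v in (0.1, 5.0, 10.0, 0.0)]
opt = torch.optim.Adam(params, lr=0.05)

with torch.no_grad():
    initial_act = neuron(gabor(*params)).item()

for _ in range(300):
    opt.zero_grad()
    loss = -neuron(gabor(*params))  # minimize -activity = gradient ascent
    loss.backward()
    opt.step()

final_act = neuron(gabor(*params)).item()
```

Because the Gabor is generated differentiably from its parameters, ordinary gradient descent on the negative activation moves the parameters toward a filter the neuron responds to more strongly.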

Example

1. We need a neuron model

Let's create a simple linear-nonlinear neuron model with a known receptive field:

```python
import numpy as np
import torch
from torch import nn
from torch.nn import functional as F

class Neuron(nn.Module):
    def __init__(self, rf):
        super().__init__()
        self.rf = torch.tensor(rf.astype(np.float32))

    def forward(self, x):
        return F.elu((x * self.rf).sum()) + 1

from fitgabor.utils import gabor_fn

theta = -np.pi / 4
ground_truth_rf = gabor_fn(
    theta, sigma=4, Lambda=14, psi=np.pi / 2,
    gamma=1, center=(15, 5), size=(64, 64),
)
neuron = Neuron(ground_truth_rf)
```

Here is the ground truth RF:

2. Initialize and train the Gabor generator

```python
from fitgabor import GaborGenerator, trainer_fn

gabor_gen = GaborGenerator(image_size=(64, 64))
gabor_gen, _ = trainer_fn(gabor_gen, neuron)
```

You can generate the learned Gabor simply by calling the model:

```python
learned_gabor = gabor_gen().data.numpy()
```
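To quantify how well the learned filter matches a known ground truth, one simple scale-invariant check is the normalized correlation between the two arrays (a sketch, not part of the fitgabor API; the variable names in the comment follow the example above):

```python
import numpy as np

def filter_similarity(a, b):
    # Pearson-style correlation between two receptive fields:
    # 1.0 means identical up to positive scaling, -1.0 means sign-flipped.
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

# e.g. filter_similarity(ground_truth_rf, learned_gabor.squeeze())
```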

And here is the learning evolution:

Cite as

```bibtex
@software{Bashiri_fitgabor_2020,
  author = {Bashiri, Mohammad},
  month = dec,
  title = {{fitgabor}},
  url = {https://github.com/mohammadbashiri/fitgabor},
  version = {0.0},
  year = {2020}
}
```

Tried it and didn't work?

Let me know, please! I would love to know why it did not work and help fix it 👷

Owner

  • Name: Mohammad Bashiri
  • Login: mohammadbashiri
  • Kind: user
  • Location: Tübingen, Germany
  • Company: Uni Tübingen

Graduate student at Sinz Lab (sinzlab.org)

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: Bashiri
    given-names: Mohammad
    orcid: https://orcid.org/0000-0001-7901-0735
title: "fitgabor"
version: 0.0
url: https://github.com/mohammadbashiri/fitgabor
date-released: 2020-12-03


Dependencies

Dockerfile docker
  • sinzlab/pytorch v3.9-torch1.9.0-cuda11.1-dj0.12.7 build
docker-compose.yml docker
  • fitgabor latest
setup.py pypi